Let us help you find the training program you are looking for.

If you can't find what you are looking for, contact us and we'll help you find it. We have over 800 training programs to choose from.

  • Course Skill Level:

    Foundational to Intermediate

  • Course Duration:

    3 days

  • Course Delivery Format:

    Live, instructor-led.

  • Course Category:

    Big Data & Data Science

  • Course Code:

    NLPROCL21E09

Who should attend & recommended skills:

Those with Python experience and basic IT, deep learning, & Linux skills

  • This course is geared for experienced Python developers, analysts, and others with Python skills who want to create machines that understand human language, using the power of Python and its ecosystem of packages dedicated to NLP and AI.
  • Skill level: foundation-level natural language processing skills for intermediate-level team members. This is not a basic class.
  • IT Skills: Basic to Intermediate (1-5 years’ experience)
  • Deep learning: Basic (1-2 years’ experience)
  • Python skills: Intermediate (3-5 years’ experience)
  • Linux: Basic (1-2 years’ experience), including familiarity with commands such as ls, cd, cp, and su

About this course

Natural Language Processing is your guide to building machines that can read and interpret human language. In it, you’ll use readily available Python packages to capture the meaning in text and react accordingly. The course expands traditional NLP approaches to include neural networks, modern deep learning algorithms, and generative techniques as you tackle real-world problems like extracting dates and names, composing text, and answering free-form questions.
The course also covers applications that understand text and speech with high accuracy. The result? Chatbots that can imitate real people, meaningful resume-to-job matches, superb predictive search, and automatically generated document summaries, all at low cost. New techniques, along with accessible tools like Keras and TensorFlow, make professional-quality NLP easier than ever before.

Skills acquired & topics covered

  • Building applications that understand text and speech with high accuracy
  • New techniques and accessible tools, such as Keras and TensorFlow, that make professional-quality NLP easier than ever
  • Working with Keras, TensorFlow, gensim, and scikit-learn
  • Rule-based and data-based NLP
  • Scalable pipelines

Course breakdown / modules

  • Natural language vs. programming language
  • The magic
  • Practical applications
  • Language through a computer’s eyes
  • A brief overflight of hyperspace
  • Word order and grammar
  • A chatbot natural language pipeline
  • Processing in depth
  • Natural language IQ

  • Challenges (a preview of stemming)
  • Building your vocabulary with a tokenizer
  • Sentiment
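As a taste of what this module covers, here is a minimal pure-Python sketch of tokenization and stemming. The regex and suffix rules are illustrative simplifications; the course works with real NLP packages rather than this toy stemmer.

```python
import re

def tokenize(text):
    """Lowercase and split text into word tokens with a simple regex."""
    return re.findall(r"[a-z0-9']+", text.lower())

def stem(token):
    """A crude rule-based stemmer: strip a few common English suffixes."""
    for suffix in ("ing", "ed", "es", "s"):
        if token.endswith(suffix) and len(token) - len(suffix) >= 3:
            return token[: -len(suffix)]
    return token

tokens = tokenize("The striped bats were hanging on their feet.")
stems = [stem(t) for t in tokens]
```

Even these few rules show the challenge previewed above: "feet" is left untouched, which is why real stemmers and lemmatizers need far richer rules.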

  • Bag of words
  • Vectorizing
  • Zipf’s Law
  • Topic modeling
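The bag-of-words idea in this module can be sketched in a few lines of plain Python: fix a vocabulary, then count how often each vocabulary word appears in a document. (The course uses vectorizers from scikit-learn for this; the hand-rolled version below is only for illustration.)

```python
from collections import Counter

def bag_of_words(tokens, vocabulary):
    """Map a token list to a count vector over a fixed vocabulary."""
    counts = Counter(tokens)
    return [counts[word] for word in vocabulary]

corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat"],
]
# The vocabulary fixes the order of the vector dimensions.
vocabulary = sorted({token for doc in corpus for token in doc})
vectors = [bag_of_words(doc, vocabulary) for doc in corpus]
```

Counting word frequencies this way across a large corpus is also how you observe Zipf's Law in practice: a few words (like "the") dominate the counts.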

  • From word counts to topic scores
  • Latent semantic analysis
  • Singular value decomposition
  • Principal component analysis
  • Latent Dirichlet allocation (LDiA)
  • Distance and similarity
  • Steering with feedback
  • Topic vector power
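Once documents are reduced to topic vectors (via LSA, SVD, or LDiA), the "distance and similarity" step usually means cosine similarity. A minimal pure-Python version, assuming dense vectors of equal length:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction,
    0.0 means orthogonal (no shared topics)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)
```

Because cosine similarity ignores vector length, a short document and a long document about the same topics still score as similar.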

  • Neural networks, the ingredient list
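The first "ingredient" of any neural network is a single neuron: a weighted sum of inputs plus a bias, passed through an activation function. A hand-rolled sketch (real models build this from Keras layers):

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs plus bias,
    squashed through a sigmoid activation into the range (0, 1)."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# With zero weights and zero bias, the sigmoid sits at its midpoint, 0.5.
output = neuron([1.0, 0.5], [0.0, 0.0], 0.0)
```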

  • Semantic queries and analogies
  • Word vectors
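The famous word-vector analogies ("king is to man as queen is to woman") come from simple vector arithmetic. The toy 2-dimensional vectors below are invented for illustration; real word vectors (for example from gensim's Word2Vec) have hundreds of dimensions learned from data.

```python
# Hypothetical toy vectors: dimension 0 ~ royalty, dimension 1 ~ masculinity.
vectors = {
    "king":  [0.9, 0.9],
    "queen": [0.9, 0.1],
    "man":   [0.1, 0.9],
    "woman": [0.1, 0.1],
    "apple": [0.0, 0.5],
}

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' by vector arithmetic: b - a + c."""
    target = [vb - va + vc for va, vb, vc in
              zip(vectors[a], vectors[b], vectors[c])]
    def distance(word):
        return sum((t - v) ** 2 for t, v in zip(target, vectors[word]))
    # Return the nearest vocabulary word, excluding the inputs themselves.
    candidates = [w for w in vectors if w not in (a, b, c)]
    return min(candidates, key=distance)
```

Here `analogy("man", "king", "woman")` lands exactly on the "queen" vector, which is the kind of semantic query this module explores at scale.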

  • Learning meaning
  • Toolkit
  • Convolutional neural nets
  • Narrow windows indeed
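The "narrow windows" of a convolutional net are literal: a small kernel slides along the sequence, taking a dot product at each position. A one-dimensional sketch in plain Python (Keras's Conv1D does this over embedding vectors, with many kernels at once):

```python
def conv1d(sequence, kernel):
    """Slide a narrow window (the kernel) along a 1-D sequence and take
    a dot product at each position -- the core idea of a Conv1D layer."""
    k = len(kernel)
    return [
        sum(sequence[i + j] * kernel[j] for j in range(k))
        for i in range(len(sequence) - k + 1)
    ]

# A [1, -1] kernel responds to local change, like an edge detector.
features = conv1d([0, 0, 1, 1, 0], [1, -1])
```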

  • Remembering with recurrent networks
  • Putting things together
  • Let’s get to learning our past selves
  • Hyperparameters
  • Predicting
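The "remembering" in a recurrent network comes from feeding the hidden state of one step into the next. A minimal single-unit forward pass in plain Python (the course builds real RNNs with Keras):

```python
import math

def rnn_forward(inputs, w_in, w_rec, bias):
    """Minimal single-unit RNN: each step combines the current input
    with the hidden state remembered from the previous step."""
    hidden = 0.0
    states = []
    for x in inputs:
        hidden = math.tanh(w_in * x + w_rec * hidden + bias)
        states.append(hidden)
    return states

# Only the first input is nonzero, yet later states stay nonzero:
# the network "remembers" its past selves.
states = rnn_forward([1.0, 0.0, 0.0], w_in=1.0, w_rec=0.5, bias=0.0)
```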

  • LSTM

  • Encoder-decoder architecture
  • Assembling a sequence-to-sequence pipeline
  • Training the sequence-to-sequence network
  • Building a chatbot using sequence-to-sequence networks
  • Enhancements
  • In the real world

  • Named entities and relations
  • Regular patterns
  • Information worth extracting
  • Extracting relationships (relations)
  • In the real world
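"Regular patterns" here means regular expressions. A deliberately narrow sketch that extracts ISO-style dates with Python's `re` module; real extractors in this module handle many more formats plus validation.

```python
import re

# A narrow, illustrative pattern for ISO-style dates (YYYY-MM-DD).
DATE_PATTERN = re.compile(r"\b(\d{4})-(\d{2})-(\d{2})\b")

def extract_dates(text):
    """Return (year, month, day) string tuples for every ISO date found."""
    return DATE_PATTERN.findall(text)

dates = extract_dates("The meeting moved from 2021-03-15 to 2021-04-02.")
```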

  • Pattern-matching approach
  • Grounding
  • Retrieval (search)
  • Generative models
  • Four-wheel drive
  • Design process
  • Trickery
  • In the real world
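The pattern-matching approach to dialog can be sketched as a list of (pattern, response-template) rules in the spirit of classic chatbots such as ELIZA. The rules below are hypothetical examples, not the course's actual rule set.

```python
import re

# Hypothetical minimal rule set: first matching pattern wins.
RULES = [
    (re.compile(r"\bmy name is (\w+)", re.I), "Nice to meet you, {0}!"),
    (re.compile(r"\bhello\b|\bhi\b", re.I), "Hello! How can I help?"),
]
FALLBACK = "Tell me more."

def reply(utterance):
    """Return the response template for the first rule whose pattern
    matches, filling in any captured groups."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return FALLBACK
```

The fallback response is part of the "trickery": when no rule matches, a vague prompt keeps the conversation going.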

  • Too much of a good thing (data)
  • Optimizing NLP algorithms
  • Constant RAM algorithms
  • Parallelizing your NLP computations
  • Reducing the memory footprint during model training
  • Gaining model insights with TensorBoard
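Two of the scaling ideas above, streaming the corpus instead of loading it, and the "hashing trick" for a fixed-size vocabulary, can be sketched with generators and a stable hash. This is an illustration of the constant-RAM idea, not the course's production pipeline (which uses tools like gensim's streaming corpora and scikit-learn's hashing vectorizer).

```python
import zlib

def stream_tokens(lines):
    """Generator pipeline: tokenize one line at a time, so the whole
    corpus never has to fit in RAM."""
    for line in lines:
        for token in line.lower().split():
            yield token

def hashed_counts(tokens, n_buckets=8):
    """The hashing trick: a fixed-size count vector, so memory stays
    constant no matter how large the vocabulary grows."""
    vector = [0] * n_buckets
    for token in tokens:
        # crc32 is stable across runs (unlike Python's built-in hash()).
        vector[zlib.crc32(token.encode()) % n_buckets] += 1
    return vector

corpus = ["the cat sat", "the dog sat"]
vector = hashed_counts(stream_tokens(corpus))
```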