AI for Text, Images and Forecasting – 3 Days

Course Description – AI for Text, Images and Forecasting

Today, there is a great need to bring AI into all aspects of software, making enterprise software smart. In fact, many companies have declared this “the year of AI” for them.  Much of enterprise software can benefit from AI; the argument one often hears is, “If our smartphones can do it, why can’t my enterprise software?”  This AI for Text, Images and Forecasting course addresses that need for software made smart through the use of AI.

The course is intended for software architects and engineers. It gives them a practical level of experience, achieved through a combination of roughly 50% lecture and 50% demo work with student participation.  (Each delivery is expected to emphasize one of the three areas: text, images, or forecasting.  This can be decided before the class or at the beginning of delivery.)

Intended Audience

Software Architects, Developers

Prerequisites


  • Familiarity with any programming language
  • Be able to navigate Linux command line
  • Basic knowledge of command line Linux editors (VI / nano)

Lab Environment

A working environment will be provided for the students; they will only need an SSH client and a browser.
Zero Install: There is no need to install software on students’ machines.

Course Outline


  • AI overview 
    • A brief history of AI
    • Types of AI systems
    • Training machine learning models
    • Applying models for prediction
    • Demos and Labs


AI with TensorFlow and Keras

    • Google’s democratization of AI with TensorFlow
    • Introducing TensorFlow
      • TensorFlow intro
      • TensorFlow Features
      • TensorFlow Versions
      • GPU and TPU scalability
      • Lab: Setting up and Running TensorFlow
      • The Tensor: The Basic Unit of TensorFlow
      • Introducing Tensors
      • TensorFlow Execution Model
      • Lab: Learning about Tensors
    • Introducing Perceptrons
      • Single Layer Linear Perceptron Classifier With TensorFlow
      • Linear Separability and the XOR Problem
      • Activation Functions
      • Softmax output
      • Backpropagation, loss functions, and Gradient Descent
      • Lab: Single-Layer Perceptron in TensorFlow
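As a preview of this module, here is a minimal single-layer perceptron sketch in plain Python (no TensorFlow, so it runs anywhere); the function names and hyperparameters are illustrative assumptions, not the actual lab code. It learns the linearly separable OR function — the same setting in which the XOR discussion shows a single layer failing:

```python
def step(x):
    # Step activation: fires 1 when the weighted sum is non-negative
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    # Classic perceptron learning rule on 2-D inputs with a bias term
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = step(w[0] * x1 + w[1] * x2 + b)
            err = target - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# OR is linearly separable, so a single-layer perceptron learns it
or_data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(or_data)
```

Run the same loop on XOR data and it never converges, which is exactly what motivates the hidden layers introduced in the next module.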
    • Hidden Layers: Intro to Deep Learning
      • Hidden Layers as a solution to the XOR problem
      • Distributed Training with TensorFlow
      • Vanishing Gradient Problem and ReLU
      • Loss Functions
      • Lab: Feedforward Neural Network Classifier in TensorFlow
    • High-level TensorFlow: tf.learn
      • Using high-level TensorFlow
      • Developing a model with tf.learn
      • Lab: Developing a tf.learn model
    • Convolutional Neural Networks in TensorFlow
      • Introducing CNNs
      • CNNs in TensorFlow
      • Lab: CNN apps
    • Introducing Keras
      • What is Keras?
      • Using Keras with a TensorFlow Backend
      • Lab: An example with Keras
    • Recurrent Neural Networks in TensorFlow
      • Introducing RNNs
      • RNNs in TensorFlow
      • Lab: RNN
      • Long Short-Term Memory (LSTM) in TensorFlow
    • Text processing elements
      • TF-IDF
      • Word2vec
      • Tokenizers, n-grams
      • Stopword removal
      • Text processing pipelines
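To illustrate the text processing elements above, here is a minimal tokenizer, stopword removal, and TF-IDF pipeline in plain Python; the stopword list and function names are assumptions made for this sketch, not the course's lab code:

```python
import math
import re

def tokenize(text):
    # Lowercase, split on non-letters, and drop a tiny stopword list
    stopwords = {"the", "a", "is", "of"}
    return [t for t in re.findall(r"[a-z]+", text.lower()) if t not in stopwords]

def tf_idf(docs):
    # docs: list of raw strings -> one {term: tf-idf weight} dict per doc
    tokenized = [tokenize(d) for d in docs]
    n = len(tokenized)
    # Document frequency: how many docs each term appears in
    df = {}
    for toks in tokenized:
        for t in set(toks):
            df[t] = df.get(t, 0) + 1
    weights = []
    for toks in tokenized:
        counts = {}
        for t in toks:
            counts[t] = counts.get(t, 0) + 1
        # Term frequency times inverse document frequency
        weights.append({t: (c / len(toks)) * math.log(n / df[t])
                        for t, c in counts.items()})
    return weights

w = tf_idf(["the cat sat", "the dog sat", "the dog barked"])
```

Note how “cat”, unique to one document, gets a higher weight than “sat”, which appears in two of the three, while stopwords never enter the vocabulary at all.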
    • Image processing elements
      • Convolutions
      • Pooling
      • Edge Detection
      • De-noising
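To make the image processing elements concrete, here is a minimal valid-mode 2-D convolution (strictly, cross-correlation, which is what deep-learning frameworks call convolution) in plain Python, applied to edge detection with a Sobel-like kernel; the names and the toy image are assumptions for the sketch:

```python
def convolve2d(image, kernel):
    # Slide the kernel over the image; each output pixel is the
    # element-wise product of the kernel with the window under it
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            acc = 0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

# Vertical-edge kernel applied to an image with a sharp left/right step
sobel_x = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
image = [[0, 0, 9, 9]] * 4  # 4x4: dark left half, bright right half
edges = convolve2d(image, sobel_x)
```

The response is strong everywhere along the intensity step and exactly zero on a flat image — the same convolution-plus-pooling machinery the CNN module builds on.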
    • Time series processing and forecasting elements
      • Traditional Time Series forecasting with ARIMA models
      • Defining Autocorrelation
      • Understanding the Dickey-Fuller Test
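As a small illustration of the autocorrelation concept above, here is a sample-autocorrelation sketch in plain Python (the function name and toy series are assumptions for this sketch):

```python
def autocorrelation(series, lag):
    # Sample autocorrelation at a given lag: covariance of the series
    # with its lagged copy, normalised by the sample variance
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[t] - mean) * (series[t - lag] - mean)
              for t in range(lag, n))
    return cov / var

# A series that repeats every 4 steps: autocorrelation is strongly
# positive at lag 4 and negative at lag 2 (half a period out of phase)
wave = [0, 1, 0, -1] * 8
a4 = autocorrelation(wave, 4)
a2 = autocorrelation(wave, 2)
```

Plotting these values across many lags gives the ACF plot used to pick ARIMA orders.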
    • Forecasting with TensorFlow and Keras
      • Using RNNs and LSTMs in time series prediction

      • Validation and metrics of time series prediction models