Enterprises that have big data and want to put it to use need a machine learning solution. TensorFlow is commonly used for deep learning, classification and prediction, image recognition, and transfer learning. Its portability across multiple platforms and devices, and its production readiness, let it solve complex business and academic research problems. Used in production environments with Keras, TensorFlow 2.0 gives you a complete machine learning solution. Let's recap how far you have come on your journey to learn TensorFlow 2.0. We started our journey by first talking about core TensorFlow as a numeric programming library. We also showed you that TensorFlow is open source, portable, powerful, production-ready software for numerical computing, any numeric computation, not just machine learning. As a numeric programming library, TensorFlow is appealing because you can write your own computation code in a high-level language like Python and have it executed very quickly at runtime. And you saw that the way core TensorFlow works is that you create a directed acyclic graph, or DAG, to represent the computation you want to do, like addition, multiplication, or subtraction. Recall that DAGs are used in models to illustrate the flow of information through a system: a DAG is simply a graph that flows in one direction, with nodes, which represent the operations, and directed edges, arrows that point in one direction. These edges carry arrays of data, or tensors. A tensor is simply an array of data whose number of dimensions determines its rank. Your data in TensorFlow are tensors, and they flow through the graph, hence the name TensorFlow. We discussed why TensorFlow uses DAGs to represent computation: portability. TensorFlow applications can run on almost any platform: local machines, cloud clusters, iOS and Android devices, CPUs, GPUs, or TPUs.
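As a minimal sketch of those ideas (our own illustrative values, not a course lab), here is TensorFlow 2.0 code showing tensors flowing through simple operations, with tf.function tracing the Python code into a graph of ops:

```python
import tensorflow as tf

# Rank-1 tensors: one-dimensional arrays of data
a = tf.constant([1.0, 2.0, 3.0])
b = tf.constant([4.0, 5.0, 6.0])

c = tf.add(a, b)        # element-wise addition
d = tf.multiply(a, b)   # element-wise multiplication

# tf.rank(a) is 1: the number of dimensions determines the rank

# @tf.function traces this Python code into a graph (a DAG of ops),
# which is what makes the computation portable across devices
@tf.function
def compute(x, y):
    return tf.add(x, y) - tf.multiply(x, y)

result = compute(a, b)  # the tensors "flow" through the graph
```

Calling `compute` the first time builds the graph; later calls with the same shapes reuse it.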
We saw that TensorFlow contains multiple abstraction layers: tf.estimator, tf.keras, and tf.data. We also showed you how it is possible to run TensorFlow at scale with AI Platform. We then took a deep dive into the components of tensors and variables, and next we looked at how to design and build a TensorFlow input data pipeline. Data is the crucial component of a machine learning model. Collecting the right data is not enough; you also need to make sure you put the right processes in place to clean, analyze, and transform the data as needed so that the model can run as efficiently as possible. We provided labs to show you how to load CSV and NumPy data, load image data, and use feature columns. But models deployed in production require lots and lots of data, data that likely won't fit in memory and may be spread across multiple files, or may be coming from an input pipeline. We looked at the tf.data API, which enables you to build complex input pipelines from simple, reusable pieces. The tf.data API makes it possible to handle large amounts of data, read from different data formats, and perform complex transformations. We provided labs to show you how to manipulate data with the TensorFlow Dataset API. You saw that the Dataset API will help you create input functions for your model that load data progressively. There are specialized dataset classes that can read data from text files like CSVs, TFRecord files, or fixed-length record files. We also provided an optional lab on how to do feature analysis using TensorFlow Data Validation and the Facets visualization tool. Lastly, we looked at working in memory and with files: when the data used to train a model sits in memory, we can create an input pipeline by constructing a dataset using tf.data.Dataset.from_tensors or tf.data.Dataset.from_tensor_slices.
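As a small sketch of that in-memory case (the feature and label values here are made up for illustration, not taken from the labs), here is how from_tensor_slices builds a pipeline out of simple, reusable pieces:

```python
import tensorflow as tf

# Small in-memory features and labels (illustrative values only)
features = tf.constant([[1.0], [2.0], [3.0], [4.0]])
labels = tf.constant([0, 1, 0, 1])

# from_tensor_slices makes one dataset element per row;
# the chained calls are the simple, reusable pipeline pieces
dataset = (
    tf.data.Dataset.from_tensor_slices((features, labels))
    .shuffle(buffer_size=4)   # randomize element order
    .batch(2)                 # group elements into batches of 2
)

# Iterating yields (features, labels) batches, loaded progressively;
# each batch has features of shape (2, 1) and labels of shape (2,)
for batch_features, batch_labels in dataset:
    print(batch_features.shape, batch_labels.shape)
```

For data that does not fit in memory, the same chaining style applies to the file-reading dataset classes, such as those for CSV or TFRecord files.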
In Module three, we showed you how to build and train a deep neural network with TensorFlow 2.0, and that meant using the Keras Sequential API, which made it really easy to do. We provided labs on linear regression and logistic regression using TensorFlow 2.0, as well as an optional advanced logistic regression lab. We shared with you use cases for the Keras Functional API and showed you how to train a neural network using it. In the module, we discussed embeddings and how to create them with the feature column API. We also discussed wide-and-deep models and when to use them, how regularization can help improve the performance of a model, and how to deploy a saved model and make predictions using gcloud and AI Platform. And we provided a lab on the Keras Functional API using TensorFlow 2.0, as well as a practice quiz on serving models in the cloud. So let's summarize your journey. You've seen that TensorFlow 2.0, using the Keras Sequential and Functional APIs, can help anyone, from new users to engineers and researchers, build simple models, standard use-case models, and models requiring increasing control. You've also learned that when it comes to model training, you can use methods such as model.fit for quick experiments, model.fit plus callbacks to customize your training loop, and a custom training loop with GradientTape for complete control. You've also seen that the training flow in TensorFlow 2.0 is unambiguous, with components that allow you to design, build, train, distribute, analyze, and save your model. The deployment phase is also unambiguous, with components that allow you to deploy to multiple device platforms. In closing, during your journey to learn TensorFlow 2.0, we have provided quizzes, a discussion forum, readings, and labs, and we sincerely hope that you have found value in these offerings.
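To make two of those training options concrete, here is a hedged sketch (the synthetic data and layer sizes are our own choices, not the course's): fitting a line first with model.fit for a quick experiment, then with a custom GradientTape loop for complete control:

```python
import numpy as np
import tensorflow as tf

# Tiny synthetic regression problem: y = 2x + 1
x = np.linspace(-1.0, 1.0, 64).reshape(-1, 1).astype("float32")
y = (2.0 * x + 1.0).astype("float32")

# Quick experiment: Keras Sequential API with model.fit
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="sgd", loss="mse")
model.fit(x, y, epochs=100, verbose=0)

# Complete control: a custom training loop with tf.GradientTape
w = tf.Variable(0.0)
b = tf.Variable(0.0)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)
for _ in range(200):
    with tf.GradientTape() as tape:
        pred = w * x[:, 0] + b
        loss = tf.reduce_mean(tf.square(pred - y[:, 0]))
    grads = tape.gradient(loss, [w, b])           # dloss/dw, dloss/db
    optimizer.apply_gradients(zip(grads, [w, b]))
# After training, w is close to 2 and b close to 1
```

The middle option, model.fit plus callbacks, slots between these: you keep the fit call but pass callback objects to hook into the training loop.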
So what are your next steps? Well, if you are new to machine learning, start with simple projects. TensorFlow 2.0 has built-in TensorFlow Datasets, which have been newly added and are extremely useful for building a prototype model training pipeline. Run our labs again, but with a different dataset, using the Keras Sequential API, and practice adding multiple layers to your network. For advanced users or those more familiar with machine learning, use the Keras Functional API with different datasets, and try making new layers and models via subclassing. Take our Advanced Machine Learning on Google Cloud course; that's next. This concludes our Introduction to TensorFlow 2.0 course.