Preface
In recent years, machine learning has gone from a niche tool for scientific and theoretical
experts to a ubiquitous part of day-to-day operations at most major players in the IT field.
This phenomenon started with an explosion in the volume of available data: during the
second half of the 2000s, the advent of many kinds of cheap data capture devices
(cellphones with integrated GPS, multi-megapixel cameras, and gravity sensors), together
with the popularization of new high-dimensional capture technologies (3D LIDAR and
optical systems, the proliferation of IoT devices, and so on), made it possible to access a
volume of information never seen before.
Additionally, in the hardware field, the approaching limits of Moore's law prompted the
development of massively parallel devices, which multiplied the computational power
available for training a given model.
These advances in hardware and data availability allowed researchers to revisit the work
of pioneers in vision-inspired neural network architectures (convolutional neural
networks, among others) and to find many new problems to which they could be applied,
thanks to the general availability of data and computational capability.
To solve these new kinds of problems, a new wave of state-of-the-art machine learning
packages was born, with players such as Keras, scikit-learn, Theano, Caffe, and Torch,
each with its own vision of how machine learning models should be defined, trained, and
executed.
On 9 November 2015, Google entered the public machine learning arena, deciding to
open-source its own machine learning framework, TensorFlow, on which many of its
internal projects were based. This first 0.5 release had a number of shortcomings in
comparison with its competitors, several of which were addressed later, especially the
ability to run distributed models.
This brief history brings us to the present day, where TensorFlow is one of the main
contenders for interested developers: the number of projects built on it keeps growing,
increasing its importance in the toolbox of any data science practitioner.
In this book, we will implement a wide variety of models using the TensorFlow library,
aiming for a low barrier to entry and a detailed approach to each problem's solution.