What is TensorFlow?

TensorFlow is a library for numerical computation created and maintained by Google. It is used mainly for machine learning (especially deep learning) tasks. While it has not yet reached a stable release (version 1.0 is currently in alpha), the library was open sourced more than a year ago (November 9, 2015). Since then it has pretty much taken the Deep Learning (DL) community by storm, and tons of companies are using it in production. The best place to learn more is the official page of TensorFlow.

On the more technical side, TensorFlow allows you to do computations on your PC/Mac (CPU & GPU), Android, iOS and lots more places. Of course, being created by Google, it aims to bring massive parallelism to your backprop musings. The main abstraction behind all the magic is stateful dataflow graphs.

Your data flowing through a graph in TensorFlow

Eh, Tensors?

The glossary of TensorFlow states that a tensor is:

A Tensor is a typed multi-dimensional array. For example, a 4-D array of floating point numbers representing a mini-batch of images with dimensions [batch, height, width, channel].

So, you can think of a tensor as a matrix on steroids - expanded to n dimensions. The concept might feel a bit strange at first, but don’t worry, it will make sense eventually.
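To make this concrete, here is the 4-D example from the quote written as a NumPy array (a toy illustration with made-up sizes):

import numpy as np

# A mini-batch of 32 RGB images, each 64x64 pixels:
# dimensions [batch, height, width, channel].
images = np.zeros((32, 64, 64, 3), dtype=np.float32)
print(images.shape)  # (32, 64, 64, 3)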

Installing TensorFlow

If you want basic installation without all the fuss, just do this:

pip install tensorflow

Or install it with GPU support:

pip install tensorflow-gpu

Otherwise, have a look here if you want to build from source (you might need this if you want to use a custom cuDNN version).

Check your setup

Now that you have everything installed, let’s check that we can import TensorFlow.

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline

tf.__version__

'1.0.0-alpha'

Success! As you can see, I am using version 1.0 alpha. Let’s get those tensors flowing.
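If you went for the GPU build, you might also want to check whether TensorFlow actually sees your GPU. One way to do that is the device_lib helper - note that it lives under an internal module path, so treat this as a quick sketch that may differ between versions:

from tensorflow.python.client import device_lib

# A GPU shows up as a device named "/gpu:0" (or "/device:GPU:0" in later versions).
print([device.name for device in device_lib.list_local_devices()])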

Bringing out the big guns

Writing TensorFlow code might require some getting used to at first. There are a few concepts that you must familiarize yourself with.

Variables

Variables are pretty standard stuff. You just have to remember one thing - define them before using them in the computational graph.
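Here is a minimal sketch of that rule (the names are purely illustrative): the variable is defined up front, and its value only exists after you initialize it inside a session:

counter = tf.Variable(10, name="counter")  # define before running the graph

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # allocate and set initial values
    print(sess.run(counter))  # -> 10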

Placeholders

Placeholders are used to feed in data from outside the computational graph. So, if you need to pass data to the model from outside TensorFlow, you have to define a placeholder. Each placeholder must specify a data type. You specify your data using feed_dict when running your computation.
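A tiny sketch of that idea (again, the names are just for illustration) - the placeholder carries only a type, and the actual numbers arrive through feed_dict when the graph is run:

x = tf.placeholder(tf.float32, shape=[None])  # any number of float32 values
doubled = x * 2

with tf.Session() as sess:
    print(sess.run(doubled, feed_dict={x: [1.0, 2.0, 3.0]}))  # -> [ 2.  4.  6.]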

Session

In order to run any meaningful operation on your graph, you need a Session. In sessions, we trust (not cookies), most of the time. Here is a short example:

v1 = tf.Variable(0.0)
p1 = tf.placeholder(tf.float32)
new_val = tf.add(v1, p1)
update = tf.assign(v1, new_val)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(5):
        sess.run(update, feed_dict={p1: 1.0})  # add the fed value to v1, five times
    print(sess.run(v1))

5.0

Simple Linear Regression in TensorFlow

This very well known model is a good way to start your exploration in TensorFlow. It is described by the following equation:

Y = aX + b

Here, Y is the dependent variable and X is the independent variable. Our task is to adjust the parameters a (the slope) and b (the intercept) so that a line describes the data as well as possible.

For our example, let’s find out how eating burgers affects your resting heart rate. The data will be simulated, so no conclusions, please! Our independent variable X represents the average number of burgers eaten per day.

X = np.random.rand(100).astype(np.float32)

The slope and intercept we are looking for are a = 50 and b = 40, respectively.

a = 50
b = 40
Y = a * X + b

Let’s have a look at what our model should figure out:

plt.plot(X, Y);

Let’s make things a tiny bit more interesting by adding a bit of noise to our dependent variable.

Y = np.vectorize(lambda y: y + np.random.normal(loc=0.0, scale=0.05))(Y)
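As a side note, np.vectorize runs a Python-level loop under the hood; an equivalent, fully vectorized alternative to the line above (use one or the other, not both) would be:

Y = Y + np.random.normal(loc=0.0, scale=0.05, size=Y.shape)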

a_var = tf.Variable(1.0)
b_var = tf.Variable(1.0)
y_var = a_var * X + b_var

Our task will be to minimize the mean squared error or, in TensorFlow parlance, to reduce the mean of the squared differences.

loss = tf.reduce_mean(tf.square(y_var - Y))
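To make the "reduce the mean" part concrete, the loss node computes the same quantity as this plain NumPy expression, shown here evaluated at the starting guess a_var = b_var = 1.0 (a cross-check sketch, not part of the model):

initial_mse = np.mean(np.square(1.0 * X + 1.0 - Y))  # NumPy equivalent of the loss at a = b = 1
print(initial_mse)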

So, let’s try to minimize it using gradient descent.

optimizer = tf.train.GradientDescentOptimizer(0.5)
train = optimizer.minimize(loss)
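For intuition, every run of train applies one step of plain gradient descent with a learning rate of 0.5. A hand-rolled NumPy version of a single update (purely illustrative - the optimizer does this for you, including the gradient computation) looks like this:

learning_rate = 0.5
a_cur, b_cur = 1.0, 1.0            # current estimates
err = a_cur * X + b_cur - Y        # residuals
grad_a = np.mean(2 * err * X)      # dL/da for L = mean((a*X + b - Y)^2)
grad_b = np.mean(2 * err)          # dL/db
a_cur -= learning_rate * grad_a
b_cur -= learning_rate * grad_b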

Let’s use our optimizer for 300 steps of learning:

TRAINING_STEPS = 300
results = []
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(TRAINING_STEPS):
        results.append(sess.run([train, a_var, b_var])[1:])
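Since results keeps the (a, b) estimates from every step, you can also peek at how they converge (an optional extra, not part of the original walkthrough):

a_trace, b_trace = zip(*results)
plt.plot(a_trace, label="a estimate")
plt.plot(b_trace, label="b estimate")
plt.legend();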

Let’s get the final (and best) estimates for a and b:

final_pred = results[-1]
a_hat = final_pred[0]
b_hat = final_pred[1]
y_hat = a_hat * X + b_hat

print("a:", a_hat, "b:", b_hat)

a: 50.0 b: 40.0

plt.plot(X, Y);
plt.plot(X, y_hat);

That’s a nice fit. The two lines overlap pretty well - what did you expect? Not bad for a couple of lines of code.

What we’ve done so far

There you have it. Eating hamburgers affects your health in a bad way (that one is probably true). More importantly, you now know a bit of TensorFlow and how to do a simple linear regression. Next up - deep neural networks.


Original article: http://curiousily.com/data-science/2017/01/22/tensorflow-for-hackers-part-1.html
