Detailed usage of tf.nn.sampled_softmax_loss
The function signature in TensorFlow is as follows. sampled_softmax_loss computes the softmax loss over only num_sampled randomly drawn classes rather than all num_classes, which greatly speeds up training when the number of classes is large:

tf.nn.sampled_softmax_loss(
    weights,                      # Shape (num_classes, dim)     - floatXX
    biases,                       # Shape (num_classes,)         - floatXX
    labels,                       # Shape (batch_size, num_true) - int64
    inputs,                       # Shape (batch_size, dim)      - floatXX
    num_sampled,                  # int
    num_classes,                  # int
    num_true=1,
    sampled_values=None,
    remove_accidental_hits=True,
    partition_strategy="mod",
    name="sampled_softmax_loss")
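Note that weights is expected in shape (num_classes, dim), which is the transpose of the usual (dim, num_classes) layout of an output layer's weight matrix; this is why the example below passes tf.transpose(weights['out']). A minimal sketch of that shape relationship (the variable names here are illustrative, not from the API):

import tensorflow as tf

dim, num_classes = 256, 10
W_out = tf.Variable(tf.random_normal([dim, num_classes]))  # usual output-layer layout
w_for_loss = tf.transpose(W_out)  # (num_classes, dim), as sampled_softmax_loss expects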
Usage example
import tensorflow as tf

# Network Parameters
n_hidden_1 = 256  # 1st layer number of features
n_input = 784     # MNIST data input (img shape: 28*28)
n_classes = 10    # MNIST total classes (0-9 digits)

# Dependent & Independent Variable Placeholders
x = tf.placeholder("float", [None, n_input])
y = tf.placeholder("float", [None, n_classes])

# Weights and Biases
weights = {
    'h1': tf.Variable(tf.random_normal([n_input, n_hidden_1])),
    'out': tf.Variable(tf.random_normal([n_hidden_1, n_classes]))
}
biases = {
    'b1': tf.Variable(tf.random_normal([n_hidden_1])),
    'out': tf.Variable(tf.random_normal([n_classes]))
}

# Super simple model builder
def tiny_perceptron(x, weights, biases):
    layer_1 = tf.add(tf.matmul(x, weights['h1']), biases['b1'])
    out_layer = tf.nn.relu(layer_1)
    # out_layer = tf.matmul(layer_1, weights['out']) + biases['out']
    return out_layer

# Create the model
pred = tiny_perceptron(x, weights, biases)

# Set up loss function inputs and inspect their shapes
w = tf.transpose(weights['out'])
b = biases['out']
labels = tf.reshape(tf.argmax(y, 1), [-1, 1])
inputs = pred
num_sampled = 3
num_true = 1
num_classes = n_classes

print('Shapes\n------\nw:\t%s\nb:\t%s\nlabels:\t%s\ninputs:\t%s'
      % (w.shape, b.shape, labels.shape, inputs.shape))
# Shapes
# ------
# w:      (10, 256)  # Requires (num_classes, dim)     - CORRECT
# b:      (10,)      # Requires (num_classes,)         - CORRECT
# labels: (?, 1)     # Requires (batch_size, num_true) - CORRECT
# inputs: (?, 256)   # Requires (batch_size, dim)      - CORRECT

loss_function = tf.reduce_mean(
    tf.nn.sampled_softmax_loss(weights=w,
                               biases=b,
                               labels=labels,
                               inputs=inputs,
                               num_sampled=num_sampled,
                               num_true=num_true,
                               num_classes=num_classes))
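The snippet above only defines the loss node; a minimal training sketch for the same graph might look like the following (the optimizer, learning rate, step count, and batch size are illustrative choices, not from the original post):

from tensorflow.examples.tutorials.mnist import input_data

# Load MNIST with one-hot labels to match the y placeholder above
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss_function)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(1000):
        batch_x, batch_y = mnist.train.next_batch(128)
        _, loss_val = sess.run([train_op, loss_function],
                               feed_dict={x: batch_x, y: batch_y})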
It is worth mentioning that if your labels are one-hot encoded, you need to convert them to class indices first: labels = tf.reshape(tf.argmax(labels_one_hot, 1), [-1, 1]).
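Also note that sampled softmax is a training-time approximation only; for evaluation or inference, the TensorFlow documentation recommends computing the full softmax over all classes. A sketch of the corresponding full-softmax loss for the graph above (this block is an addition, not part of the original post):

# Full softmax over all num_classes for evaluation;
# w is (num_classes, dim), so tf.transpose(w) restores the (dim, num_classes) layout
logits = tf.matmul(inputs, tf.transpose(w)) + b  # (batch_size, num_classes)
eval_loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits_v2(labels=y, logits=logits))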
Reference: https://stackoverflow.com/questions/43810195/tensorflow-sampled-softmax-loss-correct-usage