Self-Supervised Learning Tutorial (from the University of Oxford & DeepMind)
Self-supervised learning is a promising alternative in which pretext tasks are devised that allow models and agents to learn without explicit supervision, in a way that helps downstream performance on the tasks of interest. One of the main benefits of self-supervised learning is improved data efficiency: achieving comparable or better performance with less labelled data, or with fewer environment steps (in reinforcement learning/robotics).

Self-Supervised Learning
Andrew Zisserman
July 2019
Slides from: Carl Doersch, Ishan Misra, Andrew Owens, AJ Piergiovanni, Carl Vondrick, Richard Zhang

The ImageNet Challenge Story …
1000 categories
• Training: 1000 images for each category
• Testing: 100k images

The ImageNet Challenge Story … strong supervision

The ImageNet Challenge Story … outcomes
Strong supervision:
• Features from networks trained on ImageNet can be used for other visual tasks, e.g.
detection, segmentation, action recognition, fine-grained visual classification
• To some extent, any visual task can be solved now by:
1. Construct a large-scale dataset labelled for that task
2. Specify a training loss and neural network architecture
3. Train the network and deploy
• Are there alternatives to strong supervision for training? Self-supervised learning …
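To make the idea concrete, here is a minimal NumPy-only sketch of one common pretext task, rotation prediction: each image is rotated by a random multiple of 90°, and the rotation index serves as a label that comes for free from the data itself. The helper name `make_rotation_batch` is illustrative, not from the slides; a real pipeline would feed these batches to a network trained with cross-entropy.

```python
import numpy as np

def make_rotation_batch(images, rng):
    """Rotate each image by a random k*90 degrees and return (inputs, labels).

    The rotation index k is a 'pseudo-label': no human annotation is
    needed, because the label is derived from the transformation we
    applied ourselves. (Illustrative helper, not from the slides.)
    """
    ks = rng.integers(0, 4, size=len(images))        # one k per image
    rotated = np.stack([np.rot90(img, k) for img, k in zip(images, ks)])
    return rotated, ks

rng = np.random.default_rng(0)
images = rng.random((8, 32, 32))         # toy batch of 8 grayscale images
x, y = make_rotation_batch(images, rng)  # inputs and self-generated labels
# A network would now be trained to predict y from x; the features it
# learns can then be transferred to downstream tasks (detection, etc.).
```

The key point the recipe above highlights: step 1 (constructing a labelled dataset) is replaced by a transformation whose parameters supply the labels.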

Why Self-Supervision?
1. Expense of producing a new dataset for each new task
2. Some areas are supervision-starved, e.g. medical data, where it is hard to obtain
annotation
3. The vast supply of unlabelled images/videos remains largely untapped
– Facebook: one billion images uploaded per day
– 300 hours of video are uploaded to YouTube every minute
4. How infants may learn …