Unconscious Bias
A great workflow for investigating unconscious bias in predictive models, and a case study in R using the C5.0 package, by @DynamicWebPaige https://t.co/v9YdpiALGS #rstats
— David Smith (@revodavid) June 14, 2018
Pre-trained Transformer in PyTorch (NLP)
Reference: Day 23 NLP Transfer Learning / OpenAI
I made a @pytorch implementation of @openai's pretrained transformer with a script to import OpenAI's pre-trained model.
— Thomas Wolf (@Thom_Wolf) June 14, 2018
Link: https://t.co/6zY8NavPA3
Thanks @AlecRad, @karthik_r_n, @TimSalimans, @ilyasut for open-sourcing the code right away!
Generative Query Network
The Generative Query Network, published today in @ScienceMagazine, learns without human supervision to (1) describe scene elements abstractly, and (2) 'imagine' unobserved parts of the scene by rendering from any camera angle. @arkitus @DeepSpiker
— DeepMind (@DeepMindAI) June 14, 2018
Blog: https://t.co/mD8RR7CA4M pic.twitter.com/HxcqFAYhcz
A new machine-learning system can "observe" a scene and predict how it would look from any perspective with just a few photos. Read the research: https://t.co/l47S13yh3y pic.twitter.com/iMw0Z9TdZe
— Science Magazine (@sciencemagazine) June 14, 2018
Our work of 1.5+ years, Neural Scene Representation and Rendering, is finally out in @sciencemagazine, https://t.co/7g6vd9hVyQ https://t.co/BkltSJ1ko2 @arkitus #unsupervisedlearning #thecake #generativemodels #RL #inference #uncertainty
— Danilo J. Rezende (@DeepSpiker) June 14, 2018
Neural Video Games: distilling video games with Neural Networks so accurately that you can play as if you were playing the real thing in real time. Congrats @arkitus @DeepSpiker & al! #NeuralVideoGames https://t.co/p69903toOU pic.twitter.com/DiClh8GBTy
— Oriol Vinyals (@OriolVinyalsML) June 14, 2018
Great work. The approach allows you to learn compact representations of high-dimensional states (like 3D scenes) by training on lower-dimensional query-response pairs (like 2D images). This can be extended to other domains where complex states are hard to autoencode. https://t.co/rUXuEgafoy
— Denny Britz (@dennybritz) June 15, 2018
Really cool work from the “dream team” @DeepMindAI. Generative models of environments might pave the way toward machines that autonomously learn to understand the world around them. https://t.co/qmLsytuyEQ
— hardmaru (@hardmaru) June 14, 2018
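To make the GQN idea concrete, here is a minimal toy sketch of its data flow: a representation network embeds each (image, viewpoint) pair and sums the embeddings into an order-invariant scene code, and a generator renders a prediction for an unobserved query viewpoint. All dimensions and the random linear maps are hypothetical stand-ins for the paper's trained networks; this only illustrates the query-response structure, not the actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: flattened 2D observation, camera pose, scene code.
IMG, POSE, REPR = 64, 7, 32

# Random (untrained) linear maps standing in for the two networks.
W_enc = rng.normal(0, 0.1, size=(IMG + POSE, REPR))   # representation net
W_gen = rng.normal(0, 0.1, size=(REPR + POSE, IMG))   # generator net

def represent(images, poses):
    """Embed each (image, viewpoint) pair and sum them: the aggregated
    scene code is invariant to the order of observations, as in GQN."""
    pairs = np.concatenate([images, poses], axis=1)
    return np.tanh(pairs @ W_enc).sum(axis=0)

def render(scene_code, query_pose):
    """Predict the view from an unobserved camera pose given the scene code."""
    return np.tanh(np.concatenate([scene_code, query_pose]) @ W_gen)

# Three observed views of a scene, then render from a new query viewpoint.
obs_imgs = rng.normal(size=(3, IMG))
obs_poses = rng.normal(size=(3, POSE))
r = represent(obs_imgs, obs_poses)
pred = render(r, rng.normal(size=POSE))
```

The sum aggregation is what lets the model accept any number of observations in any order, which is the property Denny Britz's tweet highlights about learning compact state from query-response pairs.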
10k Layer Vanilla CNN
“How to Train 10,000-Layer Vanilla Convolutional Neural Networks?” https://t.co/6FWwZ1p68I
— hardmaru (@hardmaru) June 15, 2018
Ever wanted to train a 10k layer vanilla conv net? Curious why gating helps RNNs train? Super excited about our pair of ICML papers!! https://t.co/sdLx1f64am https://t.co/CV0Y2kBmNP. Really fun collaboration with @Locchiu, Minmin, @yasamanbb, @jaschasd, & Jeffrey. pic.twitter.com/YcN6NTVIjX
— Sam Schoenholz (@sschoenholz) June 15, 2018
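The trick that makes 10,000-layer vanilla conv nets trainable in this work is delta-orthogonal initialization: the spatial center tap of each kernel holds an orthogonal matrix and every other tap is zero, so each layer starts as a norm-preserving map. A rough numpy sketch of constructing such a kernel (square channel counts assumed for simplicity; details of the paper's exact scaling are omitted):

```python
import numpy as np

def delta_orthogonal_kernel(k, channels, seed=0):
    """Sketch of delta-orthogonal init: place a random orthogonal matrix
    at the kernel's spatial center and zeros everywhere else, so at init
    the convolution acts as an orthogonal pointwise transform."""
    rng = np.random.default_rng(seed)
    # Random orthogonal matrix via QR decomposition of a Gaussian matrix.
    q, _ = np.linalg.qr(rng.normal(size=(channels, channels)))
    kernel = np.zeros((k, k, channels, channels))
    kernel[k // 2, k // 2] = q
    return kernel

W = delta_orthogonal_kernel(3, 8)
center = W[1, 1]
```

Because the layer preserves signal norms at initialization, gradients neither explode nor vanish even through thousands of layers, which is what the mean-field analysis in the paper formalizes.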
Visualization
Fun fact for the #dataviz crowd:
— John Burn-Murdoch (@jburnmurdoch) June 13, 2018
This chart (and the ones in the story) are the first we've done 100% in ggplot, right down to the custom @FT font and the white bar in the top left. https://t.co/BVFmoYX2WL
Notable Research
I want to flag this fascinating arxiv (by friends incl. @andrea_e_martin), showing a neural net architecture that can both learn _and_ generalise. If you follow @GaryMarcus then you'll know that deep nets don't generalise, so this seems a real advance. https://t.co/aZvZWARWQF
— Hugh Rabagliati (@hugh_rab) June 14, 2018
Differentiable Compositional Kernel Learning for Gaussian Processes #ICML 2018
— ML Review (@ml_review) June 15, 2018
By @ssydasheng @Guodzh https://t.co/aBisxDTp2h pic.twitter.com/AhPa71rRhm
'A Probabilistic U-net for Segmentation of Ambiguous Images': a cool new paper by my colleagues at DeepMind on how to deal with uncertainty in segmentation models. https://t.co/PIVQ1CAOuh #DeepLearning #MachineLearning pic.twitter.com/6JJr68mZzD
— Nenad Tomasev (@weballergy) June 14, 2018
Tutorials and Resources
Explore the SmoothGrad feature saliency technique in an Observable notebook with TensorFlow.js by @aman_gif.
— TensorFlow (@TensorFlow) June 14, 2018
Learn more here → https://t.co/rKbJ57yRH0 pic.twitter.com/bnJnzgR5S9
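For readers new to the technique: SmoothGrad sharpens a saliency map by averaging input gradients over several noise-perturbed copies of the input. A minimal numpy sketch, using a toy analytic gradient in place of a real network's backward pass (the function and all parameters here are illustrative, not from the linked notebook):

```python
import numpy as np

def model_grad(x):
    # Toy stand-in for a network's input gradient: for f(x) = sum(x**2),
    # the gradient with respect to the input is simply 2*x.
    return 2.0 * x

def smoothgrad(x, grad_fn, n_samples=50, noise_scale=0.1, seed=0):
    """SmoothGrad: average the input gradient over noisy copies of x."""
    rng = np.random.default_rng(seed)
    total = np.zeros_like(x)
    for _ in range(n_samples):
        noisy = x + rng.normal(0.0, noise_scale, size=x.shape)
        total += grad_fn(noisy)
    return total / n_samples

x = np.array([0.5, -1.0, 2.0])
saliency = smoothgrad(x, model_grad)
```

With a real model the averaging suppresses the high-frequency noise that raw gradient maps exhibit, at the cost of `n_samples` extra forward/backward passes.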
The Github repository for Grokking Deep Learning now has all code contained within the book - many thanks to @AmberLeighTrask! https://t.co/3KEfLNjvj5
— Trask (@iamtrask) June 13, 2018
Video from my talk at @JailbreakBrewCo on using machine learning to help augment bug discovery https://t.co/iXhIfe0IUJ
— Sophia d’Antoine (@Calaquendi44) June 13, 2018
Paper & #PyTorch code for 2nd place in #CVPR2018 DeepGlobe Building Extraction Challenge:
— Alexandr Kalinin (@alxndrkalinin) June 14, 2018
TernausNetV2: Fully Convolutional Network for Instance Segmentation https://t.co/RjI6bIMPvV
- WideResnet-38 encoder
- 11 input channels
- In-Place Activated BatchNorm
- watershed transform pic.twitter.com/DEbyxHoBhZ
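An 11-channel input means the pretrained encoder's first convolution (built for 3-channel RGB) has to be adapted. One common reuse trick, sketched below in numpy, is to cycle the pretrained RGB filters across the extra channels and rescale; this is an illustrative approach, not necessarily the TernausNetV2 authors' exact scheme:

```python
import numpy as np

def expand_input_channels(w_rgb, n_in):
    """Adapt a pretrained first-conv weight of shape (out, 3, kh, kw) to
    n_in input channels by tiling the RGB filters, then rescale so the
    expected activation magnitude roughly matches the 3-channel original."""
    out, c_in, kh, kw = w_rgb.shape
    reps = int(np.ceil(n_in / c_in))
    w_new = np.tile(w_rgb, (1, reps, 1, 1))[:, :n_in]
    return w_new * (c_in / n_in)

# Hypothetical pretrained 7x7 first-conv weights, expanded from 3 to 11 channels.
w_rgb = np.random.default_rng(0).normal(size=(64, 3, 7, 7))
w11 = expand_input_channels(w_rgb, 11)
```

The rescaling keeps the layer's output variance comparable to the pretrained network's, so the rest of the encoder sees inputs in the range it was trained on.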
A few weeks ago, @mbostock wrote a tweet. I wanted to understand what he means. So I wrote a blog post about it: https://t.co/Uw22U6smYr
— Lisa Charlotte Rost (@lisacrost) June 14, 2018
rstats
#rstats ggplot2 tip: I use low alpha so I can draw hundreds of overlapping lines to display uncertainty. But it breaks the color guide (left). Adding `guides(color = guide_legend(override.aes = list(alpha = 1)))` to plot overrides the transparency in the guide (right) pic.twitter.com/dh3WiO6O36
— tj mahr 🍕🍍 (@tjmahr) June 14, 2018
Introducing dbx: a fast, easy-to-use database library for R. High performance batch operations, upserts, and more 🔥 #rstats https://t.co/FrHhgoUQw7 pic.twitter.com/QYCxjyeGlF
— Andrew Kane (@andrewkane) June 14, 2018
Classic, 😻 deck 📽 w/ code:
— Mara Averick (@dataandme) June 14, 2018
"A Gentle Introduction to Network Visualisation" by @_ColinFay
https://t.co/bO6KlRjBBf #rstats #dataviz #ggraph pic.twitter.com/Nol89gj9f3
🤝 welcome to @drsimonj inc. – your first assignment:
— Mara Averick (@dataandme) June 15, 2018
"Creating corporate colour palettes for ggplot2" https://t.co/e0GkYu6Yta #rstats #dataviz #ggplot2 pic.twitter.com/5Y6f1bDTDY
Miscellaneous
Questions to ask about software using machine learning https://t.co/dYnGt6OihT pic.twitter.com/0lMbweWXLQ
— Rachel Thomas (@math_rachel) June 14, 2018
#DataScience has taught me to always aspire to be merely average.
— Nihilist Data Scientist (@nihilist_ds) June 15, 2018
That way I will consistently match everyone’s expectations. #statistics
Feature engineering is often the way to most improve the performance of your ML system. Domain experts make the best feature engineers. https://t.co/QqsyOL7vUO
— Brandon Rohrer (@_brohrer_) June 14, 2018
Yes! Machine learning product design boils down to:
— Drew Breunig (@dbreunig) June 15, 2018
1. Having unique data
2. Asking the right question for your users
3. Figuring how to get more interesting data to ask better questions https://t.co/dqVpKzbWlw
Now that @chrmanning has called me out, I will take my duty seriously and tweet more to stave off the impending AI Winter. 😛 https://t.co/A7R0d2REPD
— Andrew Ng (@AndrewYNg) June 14, 2018
Twitter’s ML platform is now based on #TensorFlow. As usual, it came down to deployment. https://t.co/oLjG26N2MT pic.twitter.com/K7UrU6Em5C
— Delip Rao (@deliprao) June 14, 2018