What Do Machine Learning Practitioners Actually Do?
my new post: What do machine learning practitioners actually do? https://t.co/fNOKVJGISr
— Rachel Thomas (@math_rachel) July 12, 2018
"Is cleaning data really part of ML? Yes. [..] The process of cleaning a data set and training a model are usually interwoven: I frequently find issues in the model training that cause me to go back and change the pre-processing for the input data." https://t.co/GenYS5Sif6
— Monica Rogati (@mrogati) July 12, 2018
Big News for Python
Everyone gets burned out, even @gvanrossum. https://t.co/bwwJuNd0Bk
— Kenneth Reitz (@kennethreitz) July 12, 2018
whoa. 😳 pic.twitter.com/iVteqltNtn
— 👩💻 DynamicWebPaige @ SciPy2018 🔬🐍 (@DynamicWebPaige) July 12, 2018
As Python becomes the most popular programming language out there, its creator Guido van Rossum is taking a "permanent vacation" from his role of Benevolent Dictator for Life https://t.co/RYQ5Ld0oj5
— Parker Higgins (@xor) July 12, 2018
Research
UMAP
umap: Uniform Manifold Approximation and Projection for dimensionality reduction. #Python #DataScience #MachineLearning
— Randy Olson (@randal_olson) July 13, 2018
Claims to have several advantages over t-SNE. https://t.co/0N06f8ktcn pic.twitter.com/Bh5ogX6dTQ
UMAP version 0.3 is now available. You can now add new data to an existing embedding, embed using labelled data, or use both features for metric learning. Documentation is on readthedocs: https://t.co/ZFaOHrPti4. pic.twitter.com/CqiuZFbmCQ
— Leland McInnes (@leland_mcinnes) July 12, 2018
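For readers who want to try it, here is a minimal sketch of what those 0.3 features look like with the umap-learn Python package and its scikit-learn-style API; the data, labels, and parameter values below are placeholders, not recommendations.

```python
# Minimal umap-learn sketch (placeholder data and parameters).
import numpy as np
import umap

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 20))     # stand-in feature matrix
y_train = rng.integers(0, 3, size=500)   # stand-in labels
X_new = rng.normal(size=(50, 20))        # data arriving later

# Plain unsupervised embedding
reducer = umap.UMAP(n_neighbors=15, n_components=2)
embedding = reducer.fit_transform(X_train)

# Add new data to an existing embedding (new in 0.3)
new_embedding = reducer.transform(X_new)

# Embed using labelled data, for a supervised / metric-learning flavour (new in 0.3)
supervised_embedding = umap.UMAP().fit_transform(X_train, y=y_train)
```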
Slides for my @SciPyConf talk this morning on UMAP can be found here: https://t.co/XsZVuRxb5a
— Leland McInnes (@leland_mcinnes) July 12, 2018
Universal Transformers
Universal Transformers augment Transformers with recurrence in depth and Adaptive Computation Time. The model outperforms vanilla Transformers on MT / bAbI / LA / LTE.
— Oriol Vinyals (@OriolVinyalsML) July 12, 2018
Paper: https://t.co/U2YAeuO6EO
Code: Soon in https://t.co/KSuQAkn5Jh pic.twitter.com/lCKfsEAswG
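Pending the official code, here is a rough NumPy sketch of the recurrence-in-depth idea only: one shared attention/transition block applied repeatedly, where a standard Transformer would stack distinct layers. The weights and dimensions are placeholders, and per-step embeddings and Adaptive Computation Time are omitted, so this is not the paper's implementation.

```python
# Rough sketch of recurrence in depth (not the paper's code): the same
# attention and transition weights are reused at every depth step.
import numpy as np

d_model, seq_len, n_steps = 64, 10, 6
rng = np.random.default_rng(0)
W_attn = rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)  # shared
W_ffn = rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)   # shared

def self_attention(x):
    q = k = v = x @ W_attn
    scores = q @ k.T / np.sqrt(d_model)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

def transition(x):
    return np.maximum(x @ W_ffn, 0.0)   # ReLU feed-forward

x = rng.normal(size=(seq_len, d_model))
for step in range(n_steps):              # recurrence over depth
    x = x + self_attention(x)            # same weights at every step
    x = x + transition(x)
# The paper additionally adds per-step (depth) embeddings and uses
# Adaptive Computation Time to decide the number of steps per position.
```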
#UseR2018
R for Psychological Science?
Back in Sydney after a great trip to Brisbane. Just wanted to say thank you to everyone at #useR2018, and especially to @visnut for inviting me! The slides for my talk are here: https://t.co/Q6BMC5ZLXT
— Danielle Navarro (@djnavarro) July 13, 2018
Keynote
Roger Peng’s #useR2018 keynote this morning resonates with me, as another long time user/developer/instructor. Useful, opinionated take on where we are now in #rstats and how we got here. @rdpeng https://t.co/bOLSoaFupd pic.twitter.com/ejc9yFYGVA
— Jenny Bryan (@JennyBryan) July 13, 2018
Here is the video for my keynote from #useR2018 on teaching R to new users. https://t.co/KUrG097D7D
— Roger D. Peng (@rdpeng) July 15, 2018
🖤ed @rdpeng's #useR2018 keynote? Pairs nicely with 👇
— Mara Averick (@dataandme) July 13, 2018
🎬 "John Chambers' Keynote Speech from useR! 2014"https://t.co/TIaSFoGBRj via @wwwDSLA #rstats pic.twitter.com/ytV4pvERP7
fasster
📽 #useR2018 sneak peek:
— Mara Averick (@dataandme) July 12, 2018
⚡️ "fasster: Forecasting multiple seasonality w/ state switching" by @mitchoharawildhttps://t.co/TL0rc18f0v #rstats pic.twitter.com/TNdMuwzeVy
#ICML2018
Holy. Shit. Expressive, interesting, and lengthy piano music generations. I genuinely got goosebumps listening to these. As far as I'm concerned, this is a huge step forward for creative ML. Cheers Anna Huang. Paper: https://t.co/D50bdGHhxO #icml2018 pic.twitter.com/WjECu2h3Pl
— James Owers (@jamesowers) July 14, 2018
Overcoming Catastrophic Forgetting
Overcoming Catastrophic Forgetting with Hard Attention to the Task #ICML2018
— ML Review (@ml_review) July 15, 2018
By @serrjoa @Surisdi @nkundiushuti @alexk_z
Reduces catastrophic forgetting, cutting rates by 45 to 80%.
As a by-product, the hard attention masks enable weight pruning and compression. https://t.co/XGMTlYAGoj pic.twitter.com/4ISbGpzShi
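A hand-wavy sketch of the hard-attention-mask idea as described (not the authors' code): each task learns a per-unit gate whose near-binary mask scales a layer's activations, so capacity can be reserved per task and masked-out units become natural candidates for pruning.

```python
# Illustrative sketch of per-task hard attention masks (not the paper's code).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_tasks, d_in, n_units = 3, 16, 32
rng = np.random.default_rng(0)
W = 0.1 * rng.normal(size=(d_in, n_units))              # shared layer weights
task_embedding = rng.normal(size=(n_tasks, n_units))    # learned per task

def gated_layer(x, task_id, s=400.0):
    # s is an annealing temperature; large s pushes the gate towards 0/1
    mask = sigmoid(s * task_embedding[task_id])
    h = np.maximum(x @ W, 0.0)
    return h * mask                                      # hard attention to the task

x = rng.normal(size=(4, d_in))
h_task0 = gated_layer(x, task_id=0)
# When training a new task, gradients on W are attenuated wherever earlier
# tasks' masks are active, which is what limits forgetting.
```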
Long-term Structure of Music
Hear some AI generated music at our talk and poster today at #ICML2018! Talk is at 2:50 in Victoria. Poster is #175 and 👇@jesseengel @colinraffel @fjord41 @douglas_eck pic.twitter.com/ufSesFVSTV
— Adam Roberts (@ada_rob) July 12, 2018
Image Transformer
Check out the Image Transformer by @nikiparmar09 @ashVaswani others and me. Talk at 3:20p @ Victoria (Deep Learning). Visit our poster at 6:15-9:00p @ Hall B #217! https://t.co/stuwPR1fGt pic.twitter.com/i86usHSZIV
— Dustin Tran (@dustinvtran) July 12, 2018
Black-Box Variational Inference
Love this paper, it restores your faith in humanity.
— Neil Lawrence (@lawrennd) July 12, 2018
Great challenge, great science, and brings together a number of ideas across different fields.
More of this type of work please! @dennisprangle #ICML2018 https://t.co/p4D6nx6TW0
Reproducible ML
Slides of my talk at #ICML2018 on reproducible #MachineLearning workshop https://t.co/8PCk5cZlnJ https://t.co/cH3PhRZz4K #RML2018 @scikit_learn #opensource #openscience pic.twitter.com/CU22oMgsoS
— Alexandre Gramfort (@agramfort) July 14, 2018
Noise Contrastive Priors
How do we specify priors for Bayesian neural networks? Check out our work on Noise Contrastive Priors at the ICML Deep Generative Models workshop 11:40am+. @danijarh, @alexirpan, Timothy Lillicrap, James Davidson https://t.co/OS3gmKin9g pic.twitter.com/cNhfeCYuNb
— Dustin Tran (@dustinvtran) July 15, 2018
Tutorials / Resources
Plant Seedling Classification
Plant seedling classification: a competition-winning approach using data augmentation and Keras https://t.co/uCFHmriodt
— Ben Hamner (@benhamner) July 13, 2018
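For flavour, a generic Keras augmentation pipeline of the kind the write-up describes; the directory layout, image size, and parameter values here are guesses, not the winning configuration.

```python
# Generic Keras data-augmentation sketch (paths and parameters are assumptions).
from keras.preprocessing.image import ImageDataGenerator

train_gen = ImageDataGenerator(
    rescale=1.0 / 255,
    rotation_range=180,        # seedlings have no canonical orientation
    width_shift_range=0.1,
    height_shift_range=0.1,
    zoom_range=0.2,
    horizontal_flip=True,
    vertical_flip=True,
)

train_flow = train_gen.flow_from_directory(
    "data/train",              # hypothetical path: one sub-folder per class
    target_size=(224, 224),
    batch_size=32,
    class_mode="categorical",
)
# model.fit_generator(train_flow, epochs=...) then trains on augmented batches.
```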
Bayesian Data Science 2 Ways
Must-watch television: Bayesian Data Science Two Ways: Simulation and Probabilistic Programming, featuring @hugobowne and @ericmjl https://t.co/a6fVcLOOaz
— Chris Fonnesbeck (@fonnesbeck) July 12, 2018
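To illustrate the contrast in the title on a toy problem (this is not code from the talk): estimate a coin's bias first by simulation, then, in outline, with a probabilistic programming library such as PyMC3.

```python
# Toy coin-bias example, two ways (illustrative only, not from the talk).
import numpy as np

rng = np.random.default_rng(42)
data = rng.binomial(1, 0.7, size=100)       # observed flips, true bias 0.7
n_heads = data.sum()

# 1) Simulation: draw biases from a flat prior, keep those that reproduce
#    the observed head count (rejection-style inference).
candidates = rng.uniform(0, 1, size=200_000)
simulated = rng.binomial(len(data), candidates)
posterior = candidates[simulated == n_heads]
print("simulation posterior mean:", posterior.mean())

# 2) Probabilistic programming, e.g. with PyMC3 (sketched, not executed here):
# import pymc3 as pm
# with pm.Model():
#     p = pm.Beta("p", alpha=1, beta=1)
#     pm.Bernoulli("obs", p=p, observed=data)
#     trace = pm.sample(2000)
# trace["p"].mean() should agree with the simulation estimate above.
```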
Grokking Deep Learning
For anyone interested, I'm (mostly done) writing a book which teaches Deep Learning using intuitive examples more than math. All code examples are on Github, written from scratch in Numpy
— Trask (@iamtrask) July 12, 2018
If you think this is your learning style, you can download it here: https://t.co/6KdGghoqLC
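In the same spirit (though not an excerpt from the book), a from-scratch NumPy toy: learn the weight in y = 2x with plain gradient descent and nothing but arithmetic.

```python
# Tiny from-scratch gradient-descent example (illustrative, not from the book).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x                       # the relationship we want the weight to learn
w, lr = 0.0, 0.01

for _ in range(200):
    pred = w * x
    grad = (2 * (pred - y) * x).mean()   # d(mean squared error)/dw
    w -= lr * grad

print(round(w, 3))                # converges to ~2.0
```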
Kaggle Solutions
All of the available winners' solutions from our ML competitions in one meta-kernel! Thank you @sudalairajkumar https://t.co/w5Y47Xd7sA
— Kaggle (@kaggle) July 12, 2018
Seedbank #tutorials
Today we’re launching Seedbank, a place to discover interactive ML examples which you can run from your browser, no set-up required. Each example can be edited, extended, and adapted into your own project.
— TensorFlow (@TensorFlow) July 12, 2018
Read @mtyka's post for more info ↓ https://t.co/k1McWSm8PG
Tools
fuzzyjoin
Did you know there's a package to do fuzzy joins in R? See fuzzyjoin by @drob https://t.co/beBwrpqMNN #rstats h/t @jakekaupp pic.twitter.com/TVL3nHXHsf
— Sharon Machlis (@sharon000) July 12, 2018
Gensim 3.5.0
#Gensim 3.5.0 is out! 📚 MASSIVE documentation overhaul, consistent docstrings, new usage examples… (plus a bunch of new features, fixes and improvements). Enjoy~ https://t.co/Lmch7ryVqw pic.twitter.com/ZQ14n4wev0
— Gensim (@gensim_py) July 6, 2018
Miscellaneous
Our group would like to make error messages easier for a user to act on. It’s hard, but developing a guide has been a productive first step. https://t.co/oTHQ2A8FLs https://t.co/tuZRMSI3a3
— Jenny Bryan (@JennyBryan) July 12, 2018