Polynomial Regression
(As an alternative to neural nets)
Linear regressions with 2nd/3rd order polynomials were equal to or better than neural nets (aka 'deep learning') in every data set they looked at. https://t.co/e0DTbtaKDa
— David Baranger (@DABaranger) June 23, 2018
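For readers who want to try the comparison themselves, here is a minimal sketch of a 2nd-order polynomial regression in scikit-learn. The data here is synthetic (the datasets from the linked paper are not reproduced); the idea is simply that expanding features to polynomial terms and fitting ordinary least squares is a few lines of code.

```python
# Polynomial regression sketch: expand features to [x, x^2], then fit OLS.
# Synthetic data only -- an illustration of the technique, not the paper's benchmark.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = 1.5 * X[:, 0] ** 2 - 2.0 * X[:, 0] + rng.normal(0, 0.5, size=200)

model = make_pipeline(
    PolynomialFeatures(degree=2, include_bias=False),  # builds [x, x^2]
    LinearRegression(),
)
model.fit(X, y)
print(model.score(X, y))  # R^2 on the training data
```

Swapping `degree=2` for `degree=3` gives the 3rd-order variant; on tabular data this is often a strong, cheap baseline to run before reaching for a neural net.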
Removing Raindrops
Automatically removing raindrops from an image. Useful for self-driving cars. Trains an attentive generative network using adversarial training. Learns about raindrop regions & their surroundings, then generates the restored image by focusing on those surroundings. https://t.co/y423rQkMdk pic.twitter.com/e6FgxXyBSl
— Reza Zadeh (@Reza_Zadeh) June 23, 2018
Procedural Content Generation
Our survey paper on Procedural Content Generation via Machine Learning (PCGML) is now officially published. We survey the nascent field of using ML to generate game content.
Available as early access on IEEE Xplore: https://t.co/AhfLbbUI4J
And on arXiv: https://t.co/riV6eYuCZ2 pic.twitter.com/xVno9TmS0b
— Julian Togelius (@togelius) June 13, 2018
GrCAN
GrCAN: Gradient Boost Convolutional Autoencoder with Neural Decision Forest. https://t.co/2eqxqwDfDX pic.twitter.com/XN7hWfLjPa
— arxiv (@arxiv_org) June 23, 2018
Collaborative Intelligence
Human-AI Collaborations in 1,500 companies: training, explaining, interacting, and more: https://t.co/1JLaaOqoSE by @pauldaugh
— Oren Etzioni (@etzioni) June 23, 2018
Data Science vs. Statistics
"Data science vs. statistics: two cultures?" A good review article collecting & synthesizing all the diff opinions over the years. If someone should still ask what data science is about, that's the article to refer them to https://t.co/TPy5UEHTfI
— Sebastian Raschka (@rasbt) June 23, 2018
Also has a good set of references on when/how the term was coined: pic.twitter.com/fqssPaOb28
— Sebastian Raschka (@rasbt) June 23, 2018
Tutorials
This post about missing values in #julia gives a good overview of the “software engineer’s null” vs. the “data scientist’s null” and why it matters. ht @clarkfitzg https://t.co/AeDqym7Z6R
— Jenny Bryan (@JennyBryan) June 23, 2018
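The post discusses Julia's two nulls, but the same split exists in Python and is easy to demo there: `None` is the software engineer's null ("no value here at all"), while float `NaN` is the data scientist's null ("a value exists but is unknown"). A quick sketch of the behavioral difference:

```python
# None vs NaN: Python's analogue of the two-nulls distinction in the post.
import math

nothing = None          # software engineer's null: absence of any value
missing = float("nan")  # data scientist's null: an unknown numeric value

# NaN propagates through arithmetic, as a statistical missing value should:
assert math.isnan(missing + 1.0)

# NaN is not equal to itself, so test with isnan, never with ==:
assert missing != missing

# None supports no arithmetic at all; it errors rather than propagating:
try:
    nothing + 1.0
except TypeError:
    print("None does not propagate; it raises")
```

Julia keeps these as distinct values (`nothing` and `missing`) with their own semantics; the linked post explains why conflating them causes trouble in data work.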
My Do More With R series of short screencasts so far:
* Interactive scatter plots
* dplyr's case_when
* test code with testthat
* easy #rstats dashboards
* @RStudio code snippets https://t.co/Ex2LlfjNwF #R4DS pic.twitter.com/wzNxW6m7X4
— Sharon Machlis (@sharon000) June 23, 2018
Recent Advances in Variational Inference
A nice review article on recent advances in variational inference (C. Zhang, J. Butepage, H. Kjellstrom, S. Mandt): https://t.co/SLVVPAmchE #MachineLearning
— Diana Cai (@dianarycai) June 23, 2018
Small n Correlations + p Values
New post: Small n correlations + p values = disaster https://t.co/KBgRHGCvLg pic.twitter.com/mBe2oRb3Tk
— Guillaume Rousselet (@robustgar) June 22, 2018
Miscellaneous
Dissolving the Fermi Paradox: why taking uncertainties into account makes the absence of aliens less weird, and why an empty sky doesn't foretell our doom. https://t.co/7C6g14Spbd Popular FAQ: https://t.co/3a6piKhAmi
— Anders Sandberg (@anderssandberg) June 23, 2018
Deep learning is useful because it enables us to create programs that we could not otherwise code by hand. But the space of programs you can learn via deep learning models is a minuscule slice of the space of programs that we may be interested in.
— François Chollet (@fchollet) June 22, 2018