GPT-3, a Giant Step for Deep Learning and NLP
Can intelligence emerge simply by training a big enough language model on enough data? OpenAI puts this question to the test with a 175-billion-parameter model.
Hi there, and welcome to my blog.
Here you'll find resources about data science, software engineering, and life. Well, maybe not life...
Hope you enjoy.
Text generation with GPT-2 is quite easy, given the right tools. Learn how to do it, and how to fine-tune the model on your own dataset.
An unsupervised approach to digit classification and generation.
How to structure your TensorFlow graph like a software engineer.
How to apply your model to input it has never seen before.
Learn how node2vec works, and what kind of information it captures that word2vec doesn't, with a case study included.
A couple of months ago I embarked on a journey to build my personal brand as a data scientist, and I want to share how I did it with you.
Splitting your dataset into train and test sets can sometimes be more complicated than one might expect.
The weird (but cool) way to access a TensorFlow model by mounting it into a filesystem.
Learn all the details needed to implement a variational autoencoder, code included.