wander data Interpreting 135 nights of sleep with data, anomaly detection, and time series Three things are certain in life: death, taxes, and sleeping. Here, we’ll talk about the last one.
opinion On the sensationalism of artificial intelligence news When AI articles misinform and mislead.
wander data How (and why) I built an over-complicated data-driven system to remind me to drink water Data, Golang, Python, Android, Docker, gRPC, Firebase, Cloud, BigQuery. Oh my!
wander data 100 days of travels - a data recap A data-driven summary of the first 100 days of my backpacking adventure.
object detection Object Detection in the City Taking my Android model on a tour through the city of Siem Reap.
artificial intelligence Sorry, but your cat or dog AI is damaging the world. Summarizing the “Green AI” paper.
Fitbit Interpreting 270,862 Fitbit footsteps using time series analysis with Prophet Learning how much I walk during my 24 days in Malaysia
wander data "What did I visit in Singapore?" - Using Python and R to visualize and summarize my Foursquare Swarm check-ins Reliving my visit to Singapore through data
puerto rico The importance of social networks in the Puerto Rican protests How a country united under the same hashtag, #RickyRenuncia.
nlp What did Puerto Rico say after its governor resigned? A Twitter data analysis Interpreting tweets containing the hashtag #RickyRenunció using spaCy, Google Cloud, and NLP.
wander data Learning an image's leading colors using k-means Intersecting data and photography to find my preferred colors
Articles Analyzing tweets from the controversial Pokemon-related #BringBackNationalDex hashtag with spaCy and Google Cloud Discovering the top part-of-speech terms, sentiment, and mentioned Pokemon with the NLP library spaCy and Google Cloud
opinion More focus and desire to find answers, and less hype My view on why newcomers should focus on the basics and on the true meaning of working with data
Projects Reliving Avengers: Infinity War with spaCy and Natural Language Processing Discovering the top nouns, verbs, entities and text similarity within the spoken lines of Earth’s Mightiest Heroes
Projects Am I going to listen to Potato Salad today? Using data, statistics and machine learning to make sense of my obsession with this song
Articles On how I acknowledge human-based bias and how to handle it A word on how we, data practitioners, should be more aware and attentive to our own biases (story originally posted on Medium)
Projects Building a League of Legends team recommender in Go How to build a recommender system from scratch, and deploy it in the cloud
Spammers vs Data @ PyData Warsaw 2018 On November 19, 2018 I presented a talk regarding Antispam at PyData Warsaw 2018. Unlike last year's presentation, which was about a personal project, this year's talk covered my professional work.
Projects What was Puerto Rico Googling during and after Hurricane Maria? Hurricane Maria left most of Puerto Rico dark and nearly uncommunicated. Under such circumstances, I assumed that the country's Google Search patterns would change. In this article, I explore this.
Pikachu Detection System Talk @ PyData Amsterdam 2018 On May 26, 2018 I had the fantastic opportunity of presenting my Pikachu detection project at PyData Amsterdam 2018. The experience was a great one.
Projects Detecting Pikachu in videos using TensorFlow Object Detection The second part of my Pikachu Detection project is about improving the model I previously trained during the first iteration of the project, and making said model able to detect Pikachu in videos.