AI Digest #November20
By Alisher Abdulkhaev, Issue #17: November 2020. AlphaFold: a solution to a 50-year-old grand challenge in biology. The latest version of AlphaFold (AlphaFold-2) has been recognised as a solution to one of biology's grand challenges: the "protein folding problem". It was validated at CASP14, the biennial Critical Assessment of protein Structure Prediction. We're excited about the potential impact … Continue reading AI DIGEST #NOVEMBER20
Week VII: Seaborn & Pandas-Profiling
Following our Udemy tutorial, we realized that it was only one way to go. As soon as we hit some small problems, things started to get complicated. Online tutorials give you a dense but superficial overview, where everything from code to dataset is neatly prepared. But real world stuff is messy rather than neat. We … Continue reading Week VII: Seaborn & Pandas-Profiling
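A minimal sketch of the two tools named in this week's post, assuming a hypothetical "messy.csv" with missing values and mixed columns (the file name and columns are placeholders, not the dataset from the post):

```python
# Quick first look at a messy dataset with pandas-profiling and seaborn.
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
from pandas_profiling import ProfileReport

df = pd.read_csv("messy.csv")  # placeholder file name

# pandas-profiling: one call builds an HTML overview of missing values,
# distributions and correlations for every column.
ProfileReport(df, title="Quick look at the raw data").to_file("report.html")

# seaborn: a pairplot of the numeric columns gives a fast visual sense
# of relationships and obvious outliers.
sns.pairplot(df.select_dtypes("number").dropna())
plt.savefig("pairplot.png")
```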
Week VI: Clustering
After a great weekend with our study group we've finished K-Means Clustering and Hierarchical Clustering. In Classification, your model tries to predict two or more labels that you already know of (e.g. it learns from past customer data whether a future customer is going to buy your product or not). Clustering is explorative. You don't … Continue reading Week VI: Clustering
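A minimal sketch of the two clustering methods from this week, using scikit-learn on synthetic blobs rather than the course data (the cluster count of 4 is just an assumption for the toy example):

```python
# K-Means vs. hierarchical (agglomerative) clustering on synthetic data.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering

X, _ = make_blobs(n_samples=300, centers=4, random_state=42)

# K-Means: you choose the number of clusters up front
# (in practice often via the elbow method).
kmeans_labels = KMeans(n_clusters=4, n_init=10, random_state=42).fit_predict(X)

# Agglomerative clustering: merges points bottom-up;
# a dendrogram helps decide where to cut.
hier_labels = AgglomerativeClustering(n_clusters=4).fit_predict(X)

print(np.bincount(kmeans_labels), np.bincount(hier_labels))
```

Note that neither method uses labels: unlike classification, the algorithms only group similar points and leave the interpretation of each group to you.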
Week VI: A Sinking Ship
We're done with Regression and Classification and finished about 40% of the Udemy course. I was a bit skeptical to be honest, because up until now everything went perfectly and smoothly, having templates, clear guidelines and rather neat and small datasets. But the real world isn't smooth and perfect. It's complicated, confusing and flawed. … Continue reading Week VI: A Sinking Ship
Week V: Math
2 days. 2 very intense days. 20 hours of building machine learning models. And there is no way on earth I could write a consistent and coherent blog post about that in my current state. However, after finishing Regression and starting with Classification (Logistic Regression, SVM and Kernel SVM), which involved some heavy math, I … Continue reading Week V: Math
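A hedged sketch of the three classifiers mentioned above, trained on a stand-in dataset bundled with scikit-learn rather than the course data:

```python
# Logistic Regression, linear SVM and kernel (RBF) SVM on a placeholder dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "linear SVM": SVC(kernel="linear"),
    "kernel SVM (RBF)": SVC(kernel="rbf"),
}
for name, model in models.items():
    clf = make_pipeline(StandardScaler(), model)  # feature scaling matters for SVMs
    clf.fit(X_train, y_train)
    print(name, clf.score(X_test, y_test))
```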
MLTOKYO: Week IV
We're done with Week IV and Linear Regression, Multiple Linear Regression and Polynomial Regression. The most important lesson learned this time: pick the right features for Multiple Linear Regression, meaning the ones that actually have an impact. Explaining Backward Elimination, Forward Selection or Bidirectional Elimination was quite abstract and we couldn't wrap our heads around it until we … Continue reading MLTOKYO: Week IV
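A rough sketch of backward elimination as described in that post, using statsmodels on a synthetic feature matrix (the column names, the data and the 0.05 significance level are assumptions for the toy example, not the course setup):

```python
# Backward elimination: repeatedly refit OLS and drop the least significant feature.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(100, 4)), columns=["x1", "x2", "x3", "x4"])
y = 3 * X["x1"] - 2 * X["x3"] + rng.normal(size=100)  # only x1 and x3 matter

features = list(X.columns)
while features:
    model = sm.OLS(y, sm.add_constant(X[features])).fit()
    worst = model.pvalues.drop("const").idxmax()   # feature with the highest p-value
    if model.pvalues[worst] < 0.05:                # every remaining feature is significant
        break
    features.remove(worst)                         # drop it and refit

print("selected features:", features)
```

Forward Selection runs the same loop in reverse (start empty, add the most significant candidate), and Bidirectional Elimination alternates between the two.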