Delivered once every Week. No Spam Ever.

Issue - 183

Worthy Read

This blog series from Sheroy Marker covers the principles of continuous delivery (CD) for microservices. Get a practical guide on designing CD workflows for microservices, testing strategies, trunk-based development, feature toggles, and environment plans.

Welcome to a short course that will teach you how to write Python scripts that can take advantage of the processing power of multicore processors and large compute clusters. While this course is based on Python, the core ideas of functional programming and parallel functional programming are applicable to a wide range of languages. To follow this course you should already have a good basic understanding of Python, e.g. loops, functions, containers and classes. This course will rely on you understanding the material presented in my Beginning Python and Intermediate Python courses. This is a short course that will give you a taste of functional programming and how it can be used to write efficient parallel code. Please work through the course at your own pace. Python is best learned by using it, so please copy out and play with the examples provided, and also have a go at the exercises.
parallel programming
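As a taste of the functional-plus-parallel style the course describes, a pure function mapped over data can be parallelised with the standard library alone (a minimal sketch, not taken from the course material):

```python
from multiprocessing import Pool

def square(n):
    # A pure function: no shared state, so it is safe to run in parallel.
    return n * n

if __name__ == "__main__":
    numbers = list(range(10))
    # Serial: the functional building block is just map().
    serial = list(map(square, numbers))
    # Parallel: Pool.map spreads the same function across worker processes.
    with Pool(processes=4) as pool:
        parallel = pool.map(square, numbers)
    assert serial == parallel
```

Because `square` has no side effects, swapping `map` for `Pool.map` changes nothing except which core does the work.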

I just found an article about pipes in Python and was reminded that I was toying with the exact same thing recently. Having used a lot of functional languages (mainly Haskell, Clojure, and Elixir) and also a fair bit of Bash, I am very used to streaming data through chains of functions using pipe-like constructs. Python makes this quite difficult and encourages a more imperative approach with intermediate variables.
functional programming
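For illustration, one way to get Bash-style piping in Python is to overload `|` on a small wrapper class (a hypothetical `Pipe` helper, not the one from the article):

```python
class Pipe:
    """Wrap a value so functions can be chained with the | operator."""
    def __init__(self, value):
        self.value = value

    def __or__(self, func):
        # value | f  becomes  Pipe(f(value)), so chains read left to right.
        return Pipe(func(self.value))

result = (Pipe([3, 1, 2])
          | sorted
          | (lambda xs: [x * 10 for x in xs])
          | sum).value
# result is 60
```

The same computation in idiomatic Python would either nest the calls inside-out or use intermediate variables, which is exactly the friction the article describes.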

It's easy to find investment advice. It's a little less easy to find good investment advice, but still pretty easy. We are awash in advice on saving for retirement, with hundreds of books and hundreds of thousands of articles written on the subject. It is studied relentlessly, and the general consensus is that it's best to start early, make regular contributions, stick it all in low-fee index funds, and ignore it. I'm not going to dispute that, but I do want to better understand why it works so well. As programmers we don't have to simply take these studies at their word. The data is readily available, and we can explore retirement savings strategies ourselves by writing models in code. Let's take a look at how to build up a model in Python to see how much we can save over the course of a career.
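A minimal version of such a model is just compound growth with a yearly contribution (illustrative numbers, not the article's):

```python
def retirement_balance(years, annual_contribution, annual_return):
    """Balance after contributing at the end of each year and compounding."""
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + annual_return) + annual_contribution
    return balance

# e.g. 40 years of $10,000/year at an assumed 7% real return:
total = retirement_balance(40, 10_000, 0.07)
```

From this skeleton it is easy to layer in the factors the article explores, such as varying returns, fees, or contribution schedules.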

mypyc will compile type-annotated Python code to optimized C. The first goal is to compile mypy itself to make it faster, so I hope that the project will be completed. Essentially, mypyc will be similar to Cython, but mypyc handles a *subset of Python*, not a superset. Interfacing with C libraries can be easily achieved with cffi. Being a strict subset of Python means that mypyc code will execute just fine in PyPy, which could even apply some optimizations to it eventually, as the code has a strict and static type system.
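The kind of code this targets is ordinary Python that already carries full type annotations, so it runs unchanged under CPython while giving a compiler static types to exploit, e.g.:

```python
def fib(n: int) -> int:
    # Every variable has a declared, static type; a compiler like mypyc
    # can use this to emit specialised integer arithmetic.
    a: int = 0
    b: int = 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

Nothing here is compiler-specific: the annotations are standard Python syntax, which is what makes the subset-of-Python approach portable to CPython and PyPy alike.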

In the process of trying to build a vanilla Python HTTP server, I realized that I don't know much about its inner workings. So while I was stuck learning about sockets, TCP handshakes, and protocols, I decided to tackle something a little more within my reach, and thus was born this humble little project: a simple Python chat server meant to be used by terminal clients through netcat.
toy application
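The core of such a server is a listening socket that accepts connections and rebroadcasts each line to every other client. A stripped-down sketch of that idea (not the project's actual code):

```python
import socket
import threading

clients = []              # currently connected client sockets
lock = threading.Lock()   # guards the clients list across handler threads

def format_message(nick: str, text: str) -> bytes:
    # Messages are plain newline-terminated text, so netcat can display them.
    return f"[{nick}] {text}\n".encode()

def handle(conn: socket.socket, nick: str) -> None:
    with conn:
        for line in conn.makefile():  # one chat message per line
            msg = format_message(nick, line.rstrip("\n"))
            with lock:
                for other in clients:
                    if other is not conn:
                        other.sendall(msg)

def serve(host: str = "0.0.0.0", port: int = 9000) -> None:
    with socket.create_server((host, port)) as server:
        while True:
            conn, addr = server.accept()
            with lock:
                clients.append(conn)
            threading.Thread(target=handle, args=(conn, str(addr)),
                             daemon=True).start()
```

Calling `serve()` blocks forever; clients then join with `nc localhost 9000` and every line they type is relayed to everyone else.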

A beginner’s guide to understanding the inner workings of Deep Learning.
deep learning

Prophet is an open-source Python package for time-series forecasting, originally developed by Facebook’s Data Science team to predict usage on different parts of Facebook. Prophet’s forte is forecasting highly seasonal data with long-term, non-stationary trends, punctuated with occasional spikes on specific dates.

In this post, we will talk about how one of Airflow’s principles, of being ‘Dynamic’, offers configuration-as-code as a powerful construct to automate workflow generation. We’ll also talk about how that helped us use Airflow to power DISHA, a national data platform where Indian MPs and MLAs monitor the progress of 42 national level schemes. In the end, we will discuss briefly some of our reflections from the project on today’s public data technology.
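The underlying pattern is a loop over declarative configuration that emits one workflow object per entry. In Airflow the generated objects are DAGs assigned into `globals()` so the scheduler discovers them; the shape of the pattern in plain Python looks like this (hypothetical config keys, not DISHA's actual setup):

```python
# Declarative config: one entry per scheme to monitor.
SCHEMES = [
    {"name": "scheme_a", "schedule": "@daily"},
    {"name": "scheme_b", "schedule": "@weekly"},
]

def build_workflow(cfg):
    # Stand-in for constructing an Airflow DAG from one config entry.
    return {"dag_id": f"monitor_{cfg['name']}", "schedule": cfg["schedule"]}

# Configuration-as-code: workflows are generated, not written by hand.
# In an Airflow DAG file, this loop would end with:
#     globals()[dag.dag_id] = dag
workflows = {w["dag_id"]: w for w in (build_workflow(c) for c in SCHEMES)}
```

Adding a new scheme then means adding one config entry rather than writing a new pipeline by hand.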

OpenShift with Docker images is the ultimate tool you need for automated deployment.


asciify - 143 Stars, 8 Fork
Convert any image into ASCII Art.

yacs - 47 Stars, 1 Fork
YACS -- Yet Another Configuration System

Face-Track-Detect-Extract - 45 Stars, 14 Fork
Detect, track, and extract the optimal face from multi-target faces (excluding side faces and selecting the optimal face).

voice_zaloai - 23 Stars, 5 Fork
Identifying gender and regional accent from speech

dropbox_ext4 - 19 Stars, 3 Fork
Hack to make Dropbox work on non-ext4 filesystems

dove - 18 Stars, 0 Fork
A command line utility to help manage your development server in Digital Ocean

flask-executor - 16 Stars, 2 Fork
A simple Flask wrapper for concurrent.futures

sales_forecast_ml - 12 Stars, 5 Fork
A web application to predict the sales of a newly launched product

SceneGraphParser - 11 Stars, 0 Fork
A python toolkit for parsing sentences (in natural language) into scene graphs (as symbolic representation).

Face recognition using OpenCV and machine learning, written in Python

pavlova - 5 Stars, 0 Fork
A python deserialisation library built on top of dataclasses

XMeans - 4 Stars, 0 Fork
Implementation of X-means clustering algorithm

IStuPydKernel - 4 Stars, 0 Fork
StuPyd kernel for Jupyter