Abstract: Nearly two decades ago, the prominent statistician Leo Breiman wrote an influential paper titled “Statistical Modeling: The Two Cultures,” in which he presented the fundamental ideas of generative (statistical) models and algorithmic models for studying complex datasets. Breiman urged statisticians to adopt a more diverse set of tools for data modeling. As of 2019, the algorithmic model has seen rapid development for more than 20 years, and this trend will continue for years to come. Nowadays it is usually referred to as machine learning or, more specifically, neural networks, deep learning, and so on. The good news is that, with substantial investments from leading technology companies such as Google, Facebook, and Uber, Breiman’s goal of diversifying data-modeling tools can be pushed even further: the infrastructure now supports getting the best of both worlds. Researchers and practitioners are able to flexibly incorporate components of both generative and algorithmic models into a hybrid framework that allows them to prioritize either predictive power or interpretability. In this paper, the two cultures are first briefly surveyed. Second, the basic ideas of integration techniques are discussed, together with supporting programming languages. Third, exemplary implementations of integration with concrete applications are presented. Finally, recent relevant work in business and finance is introduced.