Challenge-Response: Data Science and AI in Production

9:10–9:50 a.m.

Time to market, regulatory compliance, talent shortages, and new opportunities are fueling innovation in financial services. From text analytics to automated development workflows for compliance, customers are using MATLAB® to respond with production applications. Examples of deep learning and NLP in investment, risk model governance, and ESG will be discussed.

David Rich, MathWorks

Applied Uses of AI for Investment Insights and Operational Efficiency

9:50-10:30 a.m.

With the rapid evolution of artificial intelligence and machine learning, financial institutions are wasting no time innovating with these technologies and techniques. This is particularly true in buy-side investing with significant elements of quantitative discipline, including at JPMorgan Asset Management (JPMAM).

These innovations touch a wide spectrum of investment processes, including but not limited to robust technology operations with pre-emptive issue identification, innovative and efficient product construction for passive strategies, forward-looking detection of market rotations, and systematic prediction of alpha opportunities for active businesses.

In this talk, David will discuss thematically the applied uses of AI and ML technology that help generate operational efficiencies and investable insights for JPMAM clients. The discussion will feature a brief live demo visualizing the techniques deployed.

David Lin, JP Morgan Asset Management

The R-Factor: Converting ESG and Corporate Governance Data into Investable Insights

10:30-11:10 a.m.

Availability of financially material, consistently reported, and comparable ESG data is one of the biggest challenges facing investors across the capital markets. That’s why State Street Global Advisors built R-Factor, a scoring model that leverages multiple data sources and aligns them to widely accepted, transparent materiality frameworks to generate a unique ESG score for 5,500 global companies. It draws on the materiality framework of the Sustainability Accounting Standards Board (SASB) and on national and investor-created corporate governance frameworks. R-Factor was built to address the data-quality challenges in the market and to remove opacity around ESG materiality in the scoring process. It is the only score backed by a strong stewardship commitment from an asset manager, and it is designed to put companies in the driver’s seat to help create sustainable markets.

Todd Bridges, State Street Global Advisors

Dynamic Replication and Hedging: A Reinforcement Learning Approach

11:20 a.m.–12:00 p.m.

In this talk, we address the problem of how to optimally hedge an options book in a practical setting, where trading decisions are discrete and trading costs can be nonlinear and difficult to model.

Based on reinforcement learning (RL), a well-established machine learning technique, we propose a model that is flexible, accurate, and very promising for real-world applications. A key strength of the RL approach is that it does not make any assumptions about the form of trading cost. RL learns the minimum variance hedge subject to whatever transaction cost function one provides. All it needs is a good simulator in which transaction costs and options prices are simulated accurately.
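
The speakers’ model is not reproduced here, but the objective the abstract describes (a minimum-variance hedge over a discrete set of trades, subject to a possibly nonlinear transaction cost) can be sketched in a one-period toy example. All dynamics, positions, and cost parameters below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated joint one-period moves of an option book and its underlying
# (toy dynamics: the book behaves like 100 options with delta ~ 0.55).
n_sims = 10_000
dS = rng.normal(0.0, 1.0, n_sims)                 # underlying price change
dV = 0.55 * dS + rng.normal(0.0, 0.1, n_sims)     # option value change

prev_hedge = -30         # current share position hedging the book
cost_scale = 0.02        # hypothetical nonlinear trading-cost parameter

def objective(h):
    """Variance of hedged P&L plus the cost of trading to position h."""
    pnl = 100 * dV + h * dS
    trade_cost = cost_scale * abs(h - prev_hedge) ** 1.5   # nonlinear cost
    return pnl.var() + trade_cost

# Trading decisions are discrete: search integer share positions.
best = min(range(-100, 1), key=objective)
print(best)   # near -55, the variance-minimizing hedge, nudged by cost
```

Over many rebalance periods, an RL agent would learn this trade-off from a simulator rather than re-solving it by brute-force search each time.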

This is joint work with Gordon Ritter.

Published paper: Deep Reinforcement Learning for Option Replication and Hedging

Petter Kolm, New York University

CCAR Neural Network Model

1:10-1:50 p.m.

Neural network (NN) models represent an opportunity to improve credit loss forecasting and stress testing in the Comprehensive Capital Analysis and Review (CCAR), where losses are traditionally estimated as a function of macroeconomic variables using ARIMA-type models (see Capital Planning at Large Bank Holding Companies: Supervisory Expectations and Range of Current Practice, 2013). However, challenges remain for the application of NN models, such as model interpretability under regulatory requirements and a tendency to overfit. This session will discuss this research and highlight the following:

  1. By leveraging a credit card firm’s monthly write-off data spanning more than 15 years, a parsimonious NN model can be developed that outperforms the traditional regression model with ARIMA errors in mean squared error (MSE).
  2. Two macroeconomic variables with lags are selected from a pool of 500 by a combination of LASSO and stepwise regression algorithms, which makes the NN model interpretable for the CCAR scenario narratives. The sign (direction) of the estimated input weights should be consistent with, or constrained by, business intuition.
  3. This study also found that the NN model could be vulnerable to overfitting: the stress testing results are sensitive to the design of the network architecture.
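
As a hedged sketch of the variable-selection idea in point 2, here is forward stepwise selection by in-sample MSE on synthetic data (the talk combines LASSO with stepwise and uses a pool of 500 real lagged macro variables; this toy uses 50 synthetic candidates, two of which truly drive the target):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: ~15 years of monthly observations, 50 candidate
# (lagged) macro variables, only columns 3 and 17 actually matter.
n_obs, n_vars = 180, 50
X = rng.normal(size=(n_obs, n_vars))
y = 0.8 * X[:, 3] - 0.5 * X[:, 17] + rng.normal(0.0, 0.3, n_obs)

def fit_mse(cols):
    """In-sample MSE of an OLS fit on the chosen columns plus intercept."""
    A = np.column_stack([np.ones(n_obs)] + [X[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return float(resid @ resid) / n_obs

# Forward stepwise: greedily add the regressor that lowers MSE the most,
# stopping at a parsimonious two-variable model as in the talk's setup.
selected = []
for _ in range(2):
    best_col = min((c for c in range(n_vars) if c not in selected),
                   key=lambda c: fit_mse(selected + [c]))
    selected.append(best_col)

print(sorted(selected))
```

On this synthetic data the procedure recovers the two true drivers; the signs of the fitted coefficients can then be checked against business intuition, as point 2 suggests.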

Heng Chen, HSBC and Northwestern University

How MATLAB Reshapes the Landscape to Serve Wealth Management Clients

1:50-2:30 p.m.

Wealth management is an integral function of financial institutions, providing financial and investment advisory services to high-net-worth clients. Thousands of clients, each with unique financial needs, require a powerful analytics platform to perform tasks such as asset allocation, manager selection, and custom report generation. MATLAB has been used to accommodate multi-threaded calculations throughout the entire process, from data preprocessing to financial modeling to report generation. As a result, stable recommendations for clients with sophisticated nonlinear constraints can be generated in a timely manner.

Stephanie Wang, Morgan Stanley Wealth Management

Applications of Academic Theory and Quant Techniques in Securities Lending

2:50-3:30 p.m.

Like many other markets, securities lending has seen a significant increase in digitization and electronification. This has increased market complexity and the need for more systematic decision making. As a result, Securities Finance and State Street Associates have leveraged quantitative techniques to build intelligent pricing algorithms and quantitative models that capture market price pressures. In this presentation, we will discuss our approach to quantitative modeling, examples of the solutions we have built, and their applications.

Yasser El Hamoumi, State Street Global Markets

Travis Whitmore, State Street Global Markets

Penn Wharton Budget Model: Macroeconomics in MATLAB

3:30-4:10 p.m.

The Penn Wharton Budget Model (PWBM) is an integrated model of the U.S. economy. It consists of multiple components: a demographics microsimulation, tax modules, Social Security and other government programs, and a dynamic macro-economy model. The dynamic model computes optimal decisions by heterogeneous, rational, forward-looking agents and finds the aggregate effect on prices such that agents’ actions and prices are in equilibrium. These types of models are computationally intensive and are the workhorse models used by modern computational macro-economists. PWBM uses the dynamic model, built in MATLAB, to project the effect of policy changes on U.S. household behavior and the consequences for macro-economic variables such as GDP, interest rates, debt, and capital formation.

Efraim Berkovich, University of Pennsylvania

Model Risk Management for Alpha Strategies Created with Deep Learning

4:10-4:50 p.m.

In this session, you’ll learn about:

  • Understanding the challenges of using deep learning to build alpha generation strategies
  • Model risk management to detect when machine learning strategies are not performing as intended
  • Can you model a constantly moving market? When DL should (and should not) be used

Ben Steiner, BNP Paribas Asset Management

A Master Class in Building Production-Grade NLP Pipelines

11:20 a.m.–12:00 p.m.

Building and deploying NLP applications involves multiple steps, including data ingestion, preprocessing, labeling, model building, model selection, and deployment.

While data scientists are typically involved in prototyping end-to-end applications, deploying robust NLP applications in production requires building enterprise-grade pipelines and designing each stage in the pipeline to accomplish a particular task. This workshop presents QuSandbox, an enterprise platform for prototyping, designing, and scaling production-worthy machine learning pipelines. The platform- and language-agnostic environment enables integrating multiple tools to design coherent, production-grade pipelines that are auditable, replicable, and scalable. This master class will demonstrate the use of natural language processing techniques to analyze earnings call transcripts from EDGAR, generating sentiment scores with the Amazon Comprehend, IBM Watson, Google, and Azure APIs (application programming interfaces) as well as with your own model built in MATLAB. We will then illustrate how the various steps can be streamlined through a QuSandbox pipeline to enable building scalable machine learning applications in production.

Sri Krishnamurthy, QuantUniversity

A Platform for Risk Models

1:10–2:30 p.m.

See how MATLAB® is used to drive the development, review, deployment, and monitoring stages of credit and market risk models, using the latest advances in artificial intelligence and Live Editor technology to meet regulatory standards of traceability and reproducibility.

  • Use technology advances to guarantee that model results can be reproduced by validation teams and in production
  • Explain and interpret the output of artificial intelligence models to stakeholders and regulators
  • Improve consistency of model development, validation, and documentation activities through automation

Paul Peeling, MathWorks

AI, Machine Learning, Deep Learning, and NLP in Enterprise Investment and Risk Management

2:50-4:10 p.m.

We have come a long way since the dreams of artificial intelligence in the 1950s. Predictive modeling, machine learning, and deep learning have started to permeate every aspect of finance. Two hot areas for adoption of these techniques are model validation and stress testing. With the onslaught of Twitter increasing volatility in our markets, newer companies are exploding onto the scene with novel risk management systems that use Twitter to predict volatility. Through demos, we will explore the state of the art in artificial intelligence to generate insights into how best to manage portfolios in this time of fake news and volatility.

Key areas of focus:

  • Introduction to the state of Predictive Modeling, AI, Machine Learning, and Deep Learning in Risk and Finance
  • Demos to illustrate natural language processing
  • Discussions around scaling to production
  • Discussions around model governance

Marshall Alphonso, MathWorks

Natural Language Processing and Deep Learning in Finance

4:10-4:50 p.m.

In 2010, worldwide IP traffic exceeded 20 exabytes (20 billion gigabytes) per month. With the explosion of unstructured information came a massive demand for computational power, efficiency, and explainable AI. Even with the explosion of quants (approximately 10,000 worldwide), we are still massively short in our ability to translate data into meaningful decision-making information that moves businesses. So, rather than only letting humans interact with the data through the lens of a quant, natural language processing lets us interact through some of our most practiced skills: the spoken and written word.

Natural language processing (NLP) refers to the broad class of computational techniques for incorporating speech and text data, along with other types of financial data, into the development of smart systems. MathWorks NLP systems are currently being implemented at various financial institutions worldwide. The work is being done using algorithms and visualization capabilities built into our Text Analytics Toolbox™. Many of these algorithms are complemented by the Deep Learning Toolbox™ and Statistics and Machine Learning Toolbox™.

Financial applications using NLP include:

  • Text analytics for preparing data for an NLP system (entity modeling, cleaning data)
  • Volatility modeling (GARCH, VARMA) and risk analytics (market and credit) using text
  • Sentiment modeling to discover new alpha opportunities and beat the market
  • Sentiment indicators for attribution of portfolio movements
  • Economics research indicator development using state-space models such as a Kalman filter
  • Litigation modeling: Work with complex legal languages to audit firms
  • Fund research: Modeling audits of firms to prioritize investment opportunities and flag risks
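
The state-space bullet above can be illustrated with a minimal local-level model filtered by the scalar Kalman recursions (all parameters and data here are synthetic; this is not any production indicator):

```python
import numpy as np

rng = np.random.default_rng(2)

# Local-level model: a latent random-walk "indicator" observed with noise.
T, q, r = 200, 0.05, 0.5                            # steps, state var, obs var
level = np.cumsum(rng.normal(0.0, np.sqrt(q), T))   # latent indicator
obs = level + rng.normal(0.0, np.sqrt(r), T)        # noisy readings

x, P = 0.0, 1.0                          # state estimate and its variance
filtered = []
for z in obs:
    P = P + q                            # predict: variance grows
    K = P / (P + r)                      # Kalman gain
    x = x + K * (z - x)                  # update with the innovation
    P = (1 - K) * P
    filtered.append(x)
filtered = np.array(filtered)

# The filtered track should sit closer to the latent level than the raw data.
print(np.mean((filtered - level) ** 2) < np.mean((obs - level) ** 2))
```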

Gary Kazantsev, Bloomberg Head of Quant Technology

Marshall Alphonso, MathWorks

Text Analytics and Sentiment Analysis

11:20 a.m.–12:00 p.m.

Natural language processing (NLP) is a rapidly growing area of interest in the financial services industry as quants, risk managers, and financial analysts are all interested in deriving new alpha and insights from speech and text data. In this session, you will learn how MATLAB® as a platform assists in common techniques used in text analytics, ranging from preprocessing your text data to using machine learning to model that data. Specific techniques covered in this session include:

  • Text analytics for preparing data for an NLP system (entity modeling, cleaning data)
  • Natural language processing: LSA, LDA, and word embeddings
  • Sentiment modeling to discover new alpha opportunities and beat the market
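
As a small, self-contained illustration of the LSA technique listed above (the vocabulary and documents are invented; this is not the session’s data), a truncated SVD of a term-document count matrix yields low-dimensional document embeddings:

```python
import numpy as np

# Tiny term-document count matrix: rows are terms, columns are documents.
# Docs 0-1 are "finance" texts, docs 2-3 are "sports" texts.
docs = np.array([
    [3, 2, 0, 0],   # "stock"
    [2, 3, 0, 0],   # "bond"
    [1, 2, 0, 1],   # "yield"
    [0, 0, 2, 3],   # "goal"
    [0, 0, 3, 2],   # "match"
    [0, 1, 2, 2],   # "team"
], dtype=float)

U, s, Vt = np.linalg.svd(docs, full_matrices=False)
doc_vecs = (np.diag(s[:2]) @ Vt[:2]).T    # rank-2 document embeddings

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# In the latent space, the two finance documents land close together,
# while a finance and a sports document do not.
print(cos(doc_vecs[0], doc_vecs[1]), cos(doc_vecs[0], doc_vecs[2]))
```

The same embeddings can feed downstream sentiment or topic models; LDA and word embeddings replace the SVD step with probabilistic or neural factorizations.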

Alex Link, MathWorks

Software Development with MATLAB

1:10-1:50 p.m.

MATLAB® is often used in financial services as a modeling tool. As with any tool, as the size and complexity of your application increase, it becomes more challenging to manage your development process.

If you want to develop reusable and reliable MATLAB code, collaborate with a large team, and/or build user interfaces around your models to present them to business users, you should be at this session.


  • Structuring large projects with Source Control and MATLAB Projects
  • Error handling and readying code for production
  • Unit testing and behavior-driven development
  • Dashboard development to share models

Siddharth Sundar, MathWorks

Integrating Python and MATLAB

2:50-4:10 p.m.

Financial engineers who rely only on Python® may encounter challenging tasks when it comes to C/CUDA code generation, building interactive dashboards, parallelizing applications, signal and image processing, computer vision, portfolio/risk management, and deep learning. In contrast, MATLAB® is a full-stack advanced analytics platform that empowers domain experts to rapidly prototype ideas, validate models, and push applications into production with ease.

However, sometimes it is advantageous to integrate MATLAB and Python to build on top of open-source libraries and pipe data between different IT systems or the web.

In this session, we demonstrate the many ways in which MATLAB and Python can integrate to give business users and decision makers immediate access to many of MATLAB’s built-in analytics capabilities.

Highlights include:

  • Calling Python libraries directly from MATLAB
  • Using MATLAB to rapidly build machine learning models for trading strategies
  • Calling a live MATLAB session from Python
  • Packaging MATLAB analytics as royalty-free .py libraries
  • Scaling hybrid MATLAB/Python applications via the MATLAB Production Server™

Ian McKenna, MathWorks