Extreme correlation

Shengyu ZHENG

In this article, Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024) explains the concept of extreme correlation.

Background

In financial risk management, one concept is often overlooked: extreme correlation, also known as tail dependence. Tail dependence reveals how extreme events in two variables are linked. Overlooking it can leave portfolios exposed to amplified risks during market turbulence. In this post, we present the definition and implications of this concept.

Linear correlation and copula

As presented in the post on copula, using linear correlation to model the dependence structure between random variables has many limitations, and the copula is a more general tool that captures a fuller picture of the dependence structure.

Let’s recall the definition of a copula. A copula, typically denoted C: [0,1]^d → [0,1], is a multivariate distribution function whose marginals are uniformly distributed on the unit interval. The parameter d is the number of variables. For a set of random variables U1, …, Ud with cumulative distribution functions F1, …, Fd, the copula function C satisfies:

C(F1(u1), …, Fd(ud)) = ℙ(U1 ≤ u1, …, Ud ≤ ud)
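This defining identity can be checked by simulation. Below is a minimal sketch, assuming two independent exponential variables, whose copula is the independence copula C(a, b) = a·b:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two independent exponential variables: their copula is the
# independence copula C(a, b) = a * b.
n = 200_000
x = rng.exponential(scale=1.0, size=n)
y = rng.exponential(scale=2.0, size=n)

# Pick an arbitrary point (x0, y0) and check the copula identity
# P(X <= x0, Y <= y0) = C(F(x0), G(y0)).
x0, y0 = 1.0, 1.5
F_x0 = 1 - np.exp(-x0)        # exponential CDF, rate 1
G_y0 = 1 - np.exp(-y0 / 2.0)  # exponential CDF, scale 2

joint = np.mean((x <= x0) & (y <= y0))  # empirical joint probability
copula = F_x0 * G_y0                    # independence copula value

print(abs(joint - copula))  # small Monte Carlo error
```

With a dependent pair, the same identity would hold with the appropriate (non-independence) copula on the right-hand side.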

Here we introduce the Student t-copula as an example, which will also serve as an illustration in the section on extreme correlation.

Tail dependence coefficient

The tail dependence coefficient captures the dependence level of a bivariate distribution at its tails. Let X and Y be two continuous random variables with continuous distribution functions F and G respectively. The (upper) tail dependence coefficient between X and Y is defined as:

λU = lim (u→1−) ℙ( Y > G⁻¹(u) | X > F⁻¹(u) )

with the limit λU ∈ [0,1].
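An empirical version of this coefficient can be estimated from ranks, which also illustrates the invariance under strictly increasing transformations discussed below. A minimal sketch with simulated data, where the fixed threshold u = 0.95 is an arbitrary choice standing in for the limit u → 1:

```python
import numpy as np

def upper_tail_dependence(x, y, u=0.95):
    """Empirical estimate of lambda_U at threshold u, computed on ranks
    (so the estimate is invariant under increasing transformations)."""
    n = len(x)
    # pseudo-observations: ranks scaled to [0, 1)
    fx = np.argsort(np.argsort(x)) / n
    gy = np.argsort(np.argsort(y)) / n
    tail = fx > u
    if tail.sum() == 0:
        return 0.0
    return np.mean(gy[tail] > u)

rng = np.random.default_rng(1)
n = 100_000
a = rng.normal(size=n)
b = rng.normal(size=n)  # independent of a
c = np.exp(a)           # strictly increasing transform of a

print(upper_tail_dependence(a, b))  # near 0: asymptotic independence
print(upper_tail_dependence(a, c))  # 1.0: perfect tail dependence
print(upper_tail_dependence(a, a))  # 1.0 as well -- rank invariance
```

Since exp() is strictly increasing, (a, c) has exactly the same ranks as (a, a), so both pairs give the same estimate.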

We can conclude that the tail dependence coefficient between two continuous random variables is a copula property: it remains invariant under strictly increasing transformations of the two random variables.

If λU∈(0,1], X and Y are considered asymptotically dependent in their (upper) tail. If λU=0, X and Y are considered asymptotically independent in their (upper) tail.

It is important to note that the independence of X and Y implies that λU = 0, but the converse is not necessarily true: λU describes the dependence level only at the tails.

Examples of extreme correlation

Longin and Solnik (2001) and Gkillas and Longin (2019) employ the logistic model for the dependence function of the Gumbel copula (also called the Gumbel-Hougaard copula) for Fréchet margins, which can be written in copula form as:

C(u,v) = exp( −[ (−log u)^(1/α) + (−log v)^(1/α) ]^α )
This model contains the special cases of asymptotic independence and total dependence. It is parsimonious, as we only need one parameter to model the bivariate dependence structure of exceedances, i.e., the dependence parameter α with 0 < α ≤ 1. The correlation of exceedances ρ (also called extreme correlation) can be computed from the dependence parameter α of the logistic model as follows: ρ = 1 − α². The special cases where α is equal to 1 and where α converges towards 0 correspond to asymptotic independence, in which ρ is equal to 0, and total dependence, in which ρ is equal to 1, respectively (Tiago de Oliveira, 1973).
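The mapping from the dependence parameter α to the extreme correlation ρ can be sketched in a few lines, together with the upper tail dependence coefficient of the Gumbel copula, which under this parameterization (α = 1/θ in the usual θ ≥ 1 convention) is the standard result λU = 2 − 2^α. The special cases match those given in the text:

```python
def extreme_correlation(alpha):
    """Correlation of exceedances from the logistic-model dependence
    parameter alpha (0 < alpha <= 1): rho = 1 - alpha**2."""
    return 1 - alpha**2

def gumbel_upper_tail(alpha):
    """Upper tail dependence coefficient of the Gumbel copula with
    dependence parameter alpha: lambda_U = 2 - 2**alpha."""
    return 2 - 2**alpha

# Special cases from the text
print(extreme_correlation(1.0))   # 0.0 -> asymptotic independence
print(gumbel_upper_tail(1.0))     # 0.0 -> lambda_U = 0 as well
print(extreme_correlation(1e-9))  # ~1.0 -> total dependence
print(gumbel_upper_tail(1e-9))    # ~1.0
```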

Related posts on the SimTrade blog

About extreme value theory

   ▶ Shengyu ZHENG Extreme Value Theory: the Block-Maxima approach and the Peak-Over-Threshold approach

   ▶ Shengyu ZHENG Optimal threshold selection for the peak-over-threshold approach of extreme value theory

   ▶ Gabriel FILJA Application de la théorie des valeurs extrêmes en finance de marchés

Useful resources

Academic resources

Gkillas K. and F. Longin (2018) Is Bitcoin the new digital Gold?, Working paper, ESSEC Business School.

Longin F. (2016) Extreme events in finance: a handbook of extreme value theory and its applications, Wiley.

Longin F. and B. Solnik (2001) Extreme Correlation of International Equity Markets, The Journal of Finance, 56, 649-676.

Zeevi A. and R. Mashal (2002) Beyond Correlation: Extreme Co-Movements between Financial Assets. Available at SSRN: https://ssrn.com/abstract=317122

Other resources

Extreme Events in Finance

Rieder H. E. (2014) Extreme Value Theory: A primer (slides).

About the author

The article was written in January 2024 by Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024).

Securities and Exchange Board of India (SEBI)

Nithisha CHALLA

In this article, Nithisha CHALLA (ESSEC Business School, Grande Ecole – Master in Management (MiM), 2021-2024) presents the Securities and Exchange Board of India (SEBI) which is empowering markets and ensuring integrity.

Introduction to SEBI

The Securities and Exchange Board of India (SEBI) regulates the country’s financial markets and has a significant impact on how the economy is shaped. Established in 1988, SEBI is responsible for a broad range of activities, including promoting open and honest market processes, with the protection of investors’ rights and interests as its main goal. Market manipulation, insider trading, and other fraudulent activities also fall within its scope. Thanks to SEBI’s strict standards and disclosure requirements for companies listed on Indian exchanges, investors receive reliable and timely information to help them make informed decisions. This emphasis on openness and disclosure encourages investor trust, which increases market activity.

Logo of Securities and Exchange Board of India.
Source: SEBI.

Market development and innovation

The purpose of SEBI goes beyond simple regulation; it also actively promotes market expansion and innovation. SEBI has broadened the investment options available to both institutional and individual investors by introducing mutual funds, derivatives, and alternative investment vehicles. These cutting-edge financial products have expanded the investment landscape and drawn institutional investors from abroad, helping India integrate into the world financial markets.

SEBI’s effective market surveillance systems act as a barrier to malpractice. To identify and stop market manipulation, SEBI uses an integrated surveillance system to track trade patterns, price changes, and unusual activity. Its ability to punish offenders shows how committed it is to upholding market integrity.

Global Integration and Investor Confidence

SEBI’s regulatory initiatives have won international acclaim for their market-friendly policies. This has resulted in increased foreign direct investment, portfolio investment, and institutional investor activity in Indian markets. SEBI’s role in establishing a favorable investment climate greatly contributes to India’s reputation as a desirable investment location.

While SEBI’s achievements are noteworthy, it faces challenges such as the rapid pace of technological advancements, ensuring effective implementation of regulations, and maintaining a balance between innovation and investor protection. Moreover, as the financial markets evolve, SEBI’s role in regulating emerging areas like cryptocurrencies and digital assets becomes increasingly critical.

Conclusion

The distinctiveness of SEBI rests not only in its ability to regulate, but also in its innovative projects that go beyond conventional regulatory functions. SEBI stands as a testament to India’s regulatory foresight, from empowering investors through cutting-edge processes to stimulating innovation while safeguarding investor protection. Its dedication to sustainability, education, and technology-driven surveillance distinguishes it as a regulatory pathfinder that keeps up with changes in the financial world.

Why should I be interested in this post?

For a Master in Management student like me, delving into SEBI’s operations provides a real-world context to the theories we study. Understanding SEBI’s unique initiatives, such as the Regulatory Sandbox (a framework that allows businesses, especially in the financial technology sector, to test innovative products, services, business models in a controlled environment) and its emphasis on sustainability, offers insights into modern regulatory challenges and innovative solutions. Exploring SEBI’s role in investor protection and market integrity enhances my grasp of ethical governance and responsible business practices. SEBI’s dynamic approach aligns with the multidisciplinary nature of my studies, allowing me to connect theoretical knowledge with practical implications in the financial world.

Related posts on the SimTrade blog

   ▶ All posts about financial techniques

   ▶ Akshit GUPTA Securities and Exchange Commission (SEC)

   ▶ Akshit GUPTA Autorité des Marchés Financiers (AMF)

Useful resources

SEBI What’s new in SEBI?

About the author

The article was written in January 2024 by Nithisha CHALLA (ESSEC Business School, Grande Ecole – Master in Management, 2021-2024).

Trading strategies based on market profiles and volume profiles

Michel Henry VERHASSELT

In this third article on a series on market profiles, Michel Henry VERHASSELT (ESSEC Business School – Master in Finance, 2023-2025) explains trading strategies based on market profiles and volume profiles.

Introduction

We have defined and seen illustrations of all the key concepts related to both market profiles and volume profiles. Let us now look at their practical applications and trading strategies that may be applied.

These techniques apply to both market profiles and volume profiles.

Mean reversion

A mean reversion strategy is a trading approach based on the idea that prices tend to revert to their historical average or mean over time. Traders employing this strategy look for opportunities to enter trades when prices deviate significantly from their historical average, anticipating a return to the mean.

Market profiles naturally fit this kind of strategy, as their whole point is to show where participants have deemed the price to be fair. For example, a trader could consider that when the price is trading below a high-volume area, that area will act as a magnet to pull the price up. The prices in that region were indeed considered fairer, and the current low price would be an anomaly to be corrected by market participants. Therefore, the trader would buy at the current price and sell around the POC or at least within the value area.
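The rule described above can be sketched in a few lines. The volume-by-price data and the current price below are hypothetical placeholders, and a real implementation would work with the full value area rather than the point of control (POC) alone:

```python
import numpy as np

# Hypothetical data: traded volume per price level for a past session,
# and the current market price.
volume_at_price = {99.0: 120, 99.5: 340, 100.0: 910, 100.5: 400, 101.0: 150}
last_price = 99.0

prices = np.array(sorted(volume_at_price))
volumes = np.array([volume_at_price[p] for p in prices])

# Point of control: the most-traded price, acting as the "magnet"
poc = float(prices[np.argmax(volumes)])

# Mean-reversion rule: buy below the high-volume area targeting the POC,
# sell above it, do nothing at the POC itself
if last_price < poc:
    signal, target = "buy", poc
elif last_price > poc:
    signal, target = "sell", poc
else:
    signal, target = "hold", poc

print(signal, target)  # buy 100.0
```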

Resistance and support

Conversely, a different interpretation within the same framework involves viewing these highly-traded areas as potential resistance or support zones. Support is a crucial level preventing an asset from further decline, often due to an upsurge in buying interest. In contrast, resistance is a pivotal level inhibiting an asset from rising higher, typically caused by intensified selling activity.

For a trader emphasizing resistance and support concepts, consider a rising price nearing a heavily traded zone encountering resistance, similar to reaching a ceiling. The outcome may lead to either a breakout to new highs or a reversal downward. In this context, the value area is not seen as a magnetic force drawing prices toward fair value; instead, it functions as a testing ground. The result hinges on whether the attempt to breach resistance is rejected, leading to a lower price, or successful, resulting in an upward move past this pivotal point. This dynamic interaction adds layers of complexity to mean reversion and support/resistance strategies within the realm of market profiles.

Entries and exits

More generally, traders employ various tools to make well-informed decisions about when to enter or exit market positions. One such powerful tool is the market profile. Even if a trader’s primary strategy relies on other triggers, say macro events, they can still leverage market profiles. These profiles help determine optimal entry or exit points, considering factors like obtaining liquidity with minimal market impact and identifying levels for stop losses and target profits based on perceived resistance and support.

Breakouts

As mentioned above, breakout trading is a strategy employed in financial markets where traders capitalize on significant price movements beyond established levels of support or resistance. In a breakout, the price surpasses a predefined range or pattern, triggering potential buying or selling signals. Traders often interpret breakouts as indicators of strong momentum, with the expectation that the price will continue moving in the breakout direction. The aim of breakout trading is to enter positions early in a new trend and ride the momentum for profitable gains.

Market profile can help identify breakout opportunities. For example, when a market exhibits confined trading within a narrow range and the profile reveals an accumulation of TPOs (Time Price Opportunities) near the boundaries of this range, a breakout surpassing these levels could indicate a potential trading opportunity.

False breakout strategy

The false breakout trading strategy relies on discerning instances where the price briefly moves beyond a trading range but subsequently retraces, indicating potential weaknesses in the current trend. In a false bullish breakout, signaling buyers’ weakness, traders might opt for short positions. Conversely, in retraced bearish breakouts, suggesting sellers’ uncertainty, opportunities for long positions may emerge. The effectiveness of this strategy lies in recognizing imbalances in supply and demand, a task facilitated by market profiles.

Market profiles offer a nuanced visual representation of price movements over time, highlighting areas of significant trading activity and the distribution of volume at different price levels. This information aids traders in identifying potential entry and exit points more precisely. By integrating market profiles into the false breakout strategy, traders gain insights into the dynamics of supply and demand within specific price ranges. This, in turn, enhances their ability to navigate market sentiment shifts and make informed decisions, contributing to the overall effectiveness of the false breakout trading strategy.

Single prints

The Market Profile Single Print strategy is a dynamic approach leveraging the unique concept of single prints within the Market Profile chart to identify potential breakout opportunities.

The strategy’s foundation lies in identifying single prints—instances where a price level remains untouched throughout the trading session, creating a gap in the Market Profile chart. Price can often revisit these areas to test these inefficiencies. These single prints therefore act as crucial markers, indicating potential areas of support or resistance. The significance of this lies in the ability to pinpoint breakout levels: a break above a single print suggests a bullish breakout, while a break below indicates a bearish breakout.

Crucially, market profiles assist in managing risk effectively by providing a visual representation of potential areas of support or resistance. Continual monitoring of the trade is emphasized, with adjustments made based on evolving market conditions. Trailing stop-loss orders are recommended to protect profits as the trade progresses favorably.

Related posts on the SimTrade blog

   ▶ Michel VERHASSELT Market profiles

   ▶ Michel VERHASSELT Difference between market profiles and volume profiles

   ▶ Theo SCHWERTLE Can technical analysis actually help to make better trading decisions?

   ▶ Theo SCHWERTLE The Psychology of Trading

   ▶ Clara PINTO Strategy and Tactics: From military to trading

Useful resources

Steidlmayer P.J. and S.B. Hawkins (2003) Steidlmayer on Markets: Trading with Market Profile, John Wiley & Sons, Second Edition;

Steidlmayer P.J. and K. Koy (1986) Markets and Market Logic: Trading and Investing with a Sound Understanding and Approach, Porcupine Press.

About the author

The article was written in December 2023 by Michel Henry VERHASSELT (ESSEC Business School – Master in Finance, 2023-2025).

Difference between market profiles and volume profiles

Michel Henry VERHASSELT

In this second article on a series on market profiles, Michel Henry VERHASSELT (ESSEC Business School – Master in Finance, 2023-2025) explains the difference between market profiles and volume profiles.

Comparison

Both Market Profiles and Volume Profiles follow the auction theory of markets. According to this theory, price, time and volume are the three processes through which trading takes place.

More exactly:

  • Price advertises all opportunities. It lets the participants know that they can buy or sell an asset at a given price; it tells them what their opportunities are.
  • Time regulates all opportunities. Indeed, the opportunities given by price are limited in time; they are ephemeral and depend on the liquidity and volatility of an asset, in other words, how much time it takes for the price to change and the opportunity to vanish.
  • Volume measures the success or failure of advertised opportunities. Volume reflects the degree of market participation and validates the relevance of the opportunities presented. If an opportunity is advertised and becomes successful, that means many participants agree on the fairness of this opportunity and a relatively significant amount of trading activity (volume) takes place at this price. A price that is not accepted over time is, in fact, rejected: the advertisement has failed.

All traders feel the pressure of time ticking away during a trade. When a trade stalls and doesn’t go as expected, it can create doubts, especially the longer it remains stagnant. The constant tick of the clock forces traders to ponder what might be going wrong. For instance, the late liquidation or short-covering rally in the pit session may be due to day traders running out of time rather than a lack of trading volume. In that sense, volume must take place within a given time range to validate the price advertisement.

Now when it comes to Volume Profiles, the chart shows the distribution of volume at different price levels, kind of like a visual map of where the action is happening. It uses a vertical histogram to make it easy for traders to see where the most trading activity is concentrated. This charting tool is all about giving traders a closer look at how much trading is going on at different price points over time.
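A basic volume profile of this kind can be built with a weighted histogram: bucket the trades by price and sum the traded volume in each bucket. A minimal sketch with synthetic trade data:

```python
import numpy as np

# Synthetic trade records: one price and one size per trade
rng = np.random.default_rng(2)
trade_prices = rng.normal(loc=100.0, scale=0.5, size=5_000)
trade_sizes = rng.integers(1, 10, size=5_000)

# Total volume traded in each price bucket
bin_width = 0.25
bins = np.arange(trade_prices.min(), trade_prices.max() + 2 * bin_width,
                 bin_width)
volume, edges = np.histogram(trade_prices, bins=bins, weights=trade_sizes)

# Render as a horizontal text histogram: one row per price level,
# bar length proportional to the volume traded there
for lo, v in zip(edges[:-1], volume):
    print(f"{lo:7.2f} | {'#' * int(v / volume.max() * 40)}")
```

The longest bar marks the most-traded price bucket, i.e. the point of control of this profile.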

Comparing Volume Profile to Market Profile, we find three key areas of differences: analytical focus, representation of data, and time and price dynamics.

Analytical Focus

Volume Profile: As the name suggests, Volume Profile places a paramount emphasis on volume, aiming to dissect the distribution of trading activity at different price levels over a designated timeframe.

Market Profile: In contrast, Market Profile combines time and price to create a graphical representation of market behavior. It divides price movements into designated time segments, typically 30-minute intervals, offering a nuanced perspective on the interplay between time and price.

Representation of Data

Volume Profile: The chart generated by Volume Profile provides a clear visualization of how volume is distributed across various price levels, offering insights into where significant buying or selling activity is concentrated.

Market Profile: While also representing volume, Market Profile charts use letters (TPOs) to signify the time spent at specific price levels, creating a distinctive visual pattern resembling a probability distribution.

Time and Price Dynamics

Volume Profile: Its primary concern is the interrelation of volume and price, with a focus on understanding the significance of different price levels based on the amount of trading activity.

Market Profile: Integrates time as a crucial factor, providing traders with a holistic view of market behavior over specific time intervals. This temporal dimension aids in identifying periods of heightened activity and potential areas of interest.

Let’s now look at Market and Volume profiles graphs.

Illustration

The figure below is taken from Steidlmayer’s main work: “Steidlmayer on Markets, Trading with Market Profile”. Each letter (A, B, C, D, etc.) corresponds to a single timeframe of 30 minutes. The condensed triangle-shaped figure shows where price has moved throughout the entire time period according to the trading activity.

Market profile.
Source: Steidlmayer’s book “Steidlmayer on Markets, Trading with Market Profile”.

If we rotate the figure, we get a bell-shaped pattern that looks like a normal distribution.

Market profile (reversed presentation).
Source: Steidlmayer’s book “Steidlmayer on Markets, Trading with Market Profile”.

The price distribution in a Market Profile tends to exhibit a bell-shaped pattern due to the nature of market dynamics and participant behavior. In a well-functioning and liquid market, prices are subject to constant fluctuations driven by the interplay of buying and selling activities and the bell-shaped distribution is simply a reflection of the statistical tendency of prices to cluster around a central point. The majority of trading activity should in theory occur around a fair or equilibrium price. As you move away from this central point, the occurrences of extreme price levels decrease, forming the characteristic bell curve. It is a visual representation of the market’s natural inclination to spend more time around prices that are deemed fair.

The figure below represents the volume profiles of the BTC/USDT pair on Binance’s futures market from December 8 until December 15, 2023.

Volume profile.
Source: exocharts.com.

We see the point of control (POC) that corresponds to the most traded price as a red line extending through the volume profile of each day. The value area is marked both by a whiter grey and dotted lines. The current price is a green line on the far left. On the far right, we find the volume profile for the whole timeframe displayed on the screen, with its own value area and point of control.

While the two profiles are very similar, instead of looking at price and time as the market profile does, the volume profile focuses on volume. First, the volume profile is indifferent to when exactly a given trade took place within the same timeframe, here a day. Second, the volume profile uses true volume data rather than simply whether or not a trade took place. The length of each bar within a volume profile is directly proportional to the volume of the trades at that price. In contrast, the market profile does not show the size of the trades but simply shows whether or not a price was traded during a 30-minute period, and then aggregates (or “collapses”) the data to form one profile, as we saw in the bell-shaped curve above.

Why should I be interested in this post?

Students of finance interested in financial markets and trading would be the target audience of this post. I believe this technique to be relatively obscure despite its long history. We rarely see asset charts displayed as histograms as an effort to understand market behavior and participant psychology. I believe it is fundamental to consider that the market is made up of human actors, that these actors have their biases on price and value, and in turn that these biases’ success is represented as a function of volume. Even if a student does not subscribe to this understanding of markets, it would broaden his/her perspective and allow him/her to understand trading more generally.

Related posts on the SimTrade blog

   ▶ Michel VERHASSELT Market profiles

   ▶ Michel VERHASSELT Trading strategies based on market profiles and volume profile

   ▶ Theo SCHWERTLE Can technical analysis actually help to make better trading decisions?

   ▶ Theo SCHWERTLE The Psychology of Trading

   ▶ Clara PINTO Strategy and Tactics: From military to trading

Useful resources

Steidlmayer P.J. and S.B. Hawkins (2003) Steidlmayer on Markets: Trading with Market Profile, John Wiley & Sons, Second Edition;

Steidlmayer P.J. and K. Koy (1986) Markets and Market Logic: Trading and Investing with a Sound Understanding and Approach, Porcupine Press.

TPO versus Volume Profiles

Trader Dale Volume Profile vs. Market Profile – What Is The Difference? YouTube video

About the author

The article was written in December 2023 by Michel Henry VERHASSELT (ESSEC Business School – Master in Finance, 2023-2025).

Market profiles

Michel Henry VERHASSELT

In this first article on a series on market profiles, Michel Henry VERHASSELT (ESSEC Business School – Master in Finance, 2023-2025) explains the history behind this concept and defines its central themes.

Introduction

The concept of Market Profiles emerged as a response to the dynamic nature of financial markets, where prices are in constant flux due to the continuous flow of information. Peter Steidlmayer, a trader at the Chicago Board of Trade during the 1960s and 1970s, sought to develop a charting method that could capture the interplay between price and volume, reflecting the idea that, despite the constant price changes, there should be a fair value around which prices revolve at any given time.

In traditional charting methods like bar charts and candle charts, the emphasis is typically on plotting price against time. Steidlmayer, however, wanted to make volume immediately apparent on the chart. This emphasis on volume is crucial because it provides insights into the level of participation and conviction among market participants.

The development of Market Profile was influenced by various theories and disciplines. In particular, it drew inspiration from the concept of value investing articulated by Benjamin Graham and David Dodd, the statistical bell curve, and John Schultz’s work on minimum trend. By combining these influences, Steidlmayer aimed to create a charting technique that would not only reveal price movements but also offer a visual representation of the market’s perception of value.

Market Profile, as a charting technique, differs significantly from traditional methods. Instead of using standard bar charts with prices plotted against time, Market Profile organizes data in a way that reflects the distribution of prices at different levels. Each time period is represented by a separate column, with prices displayed in ascending order on the vertical axis. This organization provides a visual representation of how much time the market spent at different price levels, creating a histogram-like structure.

The resulting chart, with letters (A, B, C, D, etc.) representing Time Price Opportunities (TPO), helps traders identify key areas such as the Value Area (where the majority of trading activity occurred), the Point of Control (the most traded price level), and Single Prints (indicating areas of price discovery). These elements collectively contribute to a comprehensive understanding of market dynamics and help traders make more informed decisions.

Definitions

We define below the key terms to understand Market Profile: Volume, Value Area, and Point of Control.

Volume

Volume in the context of financial markets refers to the number of contracts or shares traded during a specific time period. Volume is a crucial component in Market Profile analysis because it provides insights into the level of participation and conviction among market participants. High volume at a particular price level suggests a significant level of interest or agreement on the value of the asset at that point.

Volume helps us shape the Time Price Opportunities. A TPO represents a unit of time and price on a Market Profile chart. Each 30-minute period (or another specified time frame) is represented by a letter, forming a vertical histogram on the price axis. TPOs help visualize the distribution of trading activity at different price levels over time. By organizing price data into these time brackets, traders can identify patterns, trends, and areas of importance, contributing to a better understanding of market behavior.
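The construction of a TPO chart can be sketched as follows. Each 30-minute period is assigned a letter and contributes one TPO to every price level it traded through. The per-period price ranges below are hypothetical, and real charts use much finer price ticks:

```python
from collections import defaultdict

# Hypothetical (low, high) price range for each 30-minute period
period_ranges = [(100, 104), (101, 105), (102, 104), (102, 103), (101, 103)]
letters = "ABCDE"  # one letter per period
tick = 1           # price level granularity

# price level -> string of TPO letters traded at that level
profile = defaultdict(str)
for letter, (low, high) in zip(letters, period_ranges):
    for price in range(low, high + tick, tick):
        profile[price] += letter

# Print the profile top-down, like a Market Profile chart
for price in sorted(profile, reverse=True):
    print(f"{price}: {profile[price]}")
```

The rows with the most letters (here the middle prices) form the widest part of the profile, which is where the value area and point of control discussed next are found.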

Value Area

The Value Area represents the range of price levels that contain a specific percentage of the total traded volume (usually 70% of the day’s trading activity). Traders also use the Upper Value Area (where 15% of the volume is located above) and the Lower Value Area (where 15% of the volume is below), with the area in between considered the “fair value” zone. It helps traders identify the price levels that are deemed fair by the market. It provides insights into where the majority of trading activity occurred, offering potential support and resistance zones for future price movements.
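A common way to compute the value area is to start at the most-traded price and expand outward, one price row at a time, until the chosen coverage (70% here) of total activity is included. A simplified sketch with hypothetical volume-by-price data:

```python
import numpy as np

def value_area(prices, volumes, coverage=0.70):
    """Return (value area low, point of control, value area high):
    expand outward from the most-traded price, always adding the
    larger adjacent row, until `coverage` of total volume is included."""
    order = np.argsort(prices)
    prices = np.asarray(prices)[order]
    volumes = np.asarray(volumes)[order]
    poc = int(np.argmax(volumes))  # index of the point of control
    lo = hi = poc
    total = volumes.sum()
    covered = volumes[poc]
    while covered < coverage * total:
        up = volumes[hi + 1] if hi + 1 < len(volumes) else -1
        down = volumes[lo - 1] if lo > 0 else -1
        if up >= down:
            hi += 1
            covered += volumes[hi]
        else:
            lo -= 1
            covered += volumes[lo]
    return float(prices[lo]), float(prices[poc]), float(prices[hi])

p = [99.0, 99.5, 100.0, 100.5, 101.0]  # hypothetical price levels
v = [100, 400, 900, 450, 150]          # hypothetical volume per level
print(value_area(p, v))  # (99.5, 100.0, 100.5)
```

Real implementations often add rows in pairs and break ties differently, but the expand-from-the-POC idea is the same.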

Point of Control

Within the value area, we find the Point of Control. The Point of Control is the price level at which the most TPOs occurred during a specific time period. It is considered a point of balance and represents the price where the market found the most acceptance. It indicates the price level that had the most trading activity, suggesting a level of equilibrium where buyers and sellers found agreement. Traders often monitor the POC for potential shifts in market sentiment.

By understanding the interplay between these elements, traders can gain valuable insights into market dynamics, identify key support and resistance zones, and make more informed decisions in their trading strategies.

With this background and definitions, we can look further into the practice of market profiles and its closely related concept, volume profiles.

Why should I be interested in this post?

Students of finance interested in financial markets and trading would be the target audience of this post. I believe this technique to be relatively obscure despite its long history. We rarely see asset charts displayed as histograms as an effort to understand market behavior and participant psychology. I believe it is fundamental to consider that the market is made up of human actors, that these actors have their biases on price and value, and in turn that these biases’ success is represented as a function of volume. Even if a student does not subscribe to this understanding of markets, it would broaden his/her perspective and allow him/her to understand trading more generally.

Related posts on the SimTrade blog

   ▶ Michel VERHASSELT Difference between market profiles and volume profiles

   ▶ Michel VERHASSELT Trading strategies based on market profiles and volume profile

   ▶ Theo SCHWERTLE Can technical analysis actually help to make better trading decisions?

   ▶ Theo SCHWERTLE The Psychology of Trading

   ▶ Clara PINTO Strategy and Tactics: From military to trading

Useful resources

Steidlmayer P.J. and S.B. Hawkins (2003) Steidlmayer on Markets: Trading with Market Profile, John Wiley & Sons, Second Edition;

Steidlmayer P.J. and K. Koy (1986) Markets and Market Logic: Trading and Investing with a Sound Understanding and Approach, Porcupine Press.

Letian Wang (2020) Using Python for Market Profiles

About the author

The article was written in December 2023 by Michel Henry VERHASSELT (ESSEC Business School – Master in Finance, 2023-2025).

Impact of management control on the company

Medine ACAR

In this article, Medine ACAR (ESSEC Business School, Bachelor in Business Administration (BBA) Program, 2020-2024) analyzes the impact of management control within the company.

Introduction

Management control is a key function in companies, focused on performance and efficiency. It involves planning, measuring, and analyzing activities to align performance with the company’s strategic objectives. This process includes budgeting, financial forecasting, and the analysis of variances between actual results and forecasts. Management control also helps identify opportunities for improvement and implement corrective strategies to optimize operations and costs. Among other things, management control ensures the health and viability of companies. Let us go further.

Improving Performance and Decision-Making

Management control, at the heart of corporate strategy, plays a decisive role in analyzing and improving financial performance. It offers a clear view of the organization's strengths and weaknesses, enabling more strategic and better-informed decision-making. Case studies in various sectors, such as those conducted on companies like IBM or General Electric, illustrate how the rigorous application of management control can lead to a significant transformation in performance and resource management. For example, GE's implementation of "Six Sigma" and Lean management practices under the leadership of Jack Welch led to substantial improvements in operational efficiency and cost reduction (case study: General Electric's Two-Decade Transformation Under the Leadership of Jack Welch).

Risk Management and Foundations of Sustainability

Beyond mere financial monitoring, management control is essential for risk management and the long-term sustainability of the company. It makes it possible to identify potential risks, both financial and operational, and to put strategies in place to mitigate them. Research in the banking sector, for example, highlights the importance of this function in preventing financial crises and ensuring continued stability.

The study "Management controls and crisis: evidence from the banking sector", conducted by Pall Rikhardsson, Carsten Rohde, Leif Christensen and Catherine E. Batt in 2021 on the use of management controls during the 2008 financial crisis in six banks, revealed that the use of both organic and mechanistic management controls was essential to manage change.

These controls play three main roles:

  • Guiding and controlling behavior
  • Changing internal and external perceptions
  • Ensuring accountability.

Summary

Management control is not merely a financial monitoring tool; it is a strategic lever that profoundly influences performance, decision-making, risk management and, ultimately, the sustainability of the company. Studies in this field confirm its invaluable role in the success and longevity of companies around the world.

Related posts on the SimTrade blog

   ▶ Jessica BAOUNON Enjeux de la pratique de la pleine conscience et de l’intelligence émotionnelle dans la fonction de contrôle de gestion

   ▶ Chloé POUZOL Contrôle de gestion chez Edgar suites

   ▶ Emma LAFARGUE Contrôle de gestion chez Chanel

Useful resources

Robert Obert and Marie-Pierre Mairesse (2008) “Le Contrôle de Gestion: Organisation et Mise en Œuvre”, Dunod.

Case Study: General Electric’s Two-Decade Transformation Under the Leadership of Jack Welch

6 sigma (2017) General Electric (GE) et Six Sigma

Henderson, K.M. and Evans, J.R. (2000) “Successful implementation of Six Sigma: benchmarking General Electric Company”, Benchmarking: An International Journal, 7(4): 260-282.

Karim Saïd and Soufiane Kherrazi (2021) Du contrôle de gestion à l’innovation dans le contrôle HBR France

Rikhardsson, P., Rohde, C., Christensen, L. and Batt, C.E. (2021) “Management controls and crisis: evidence from the banking sector” Accounting, Auditing & Accountability Journal, 34(4): 757-785.

About the author

The article was written in December 2023 by Medine ACAR (ESSEC Business School, Bachelor in Business Administration (BBA) Program, 2020-2024).

Volume-Weighted Average Price (VWAP)

Volume-Weighted Average Price (VWAP)

Raphael TRAEN

In this article, Raphael TRAEN (ESSEC Business School, Global BBA, 2023-2024) explains the Volume-Weighted Average Price (VWAP), a statistic used by traders to determine the average trading price of a security, taking transaction volume into account.

Definition

The volume-weighted average price (VWAP) is a measurement that shows the average price of a security, adjusted for its volume. It is calculated during a specific trading session by taking the total dollar value of trading in the security (sum of the products of the price by the quantity of each trade during the trading session) and dividing it by the total volume of trades (sum of the quantities of each trade during the trading session). The formula for calculating VWAP is given by

Formula VWAP

Where N is the number of transactions during the trading session (trading day).

VWAP can also be computed for consecutive time intervals during the trading sessions.

Sometimes, the price is replaced by a “typical price” computed as the average of the minimum price, maximum price, and closing price observed over a time interval.

Typical price
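These definitions can be sketched in a few lines of Python. The trade data below is hypothetical, assuming a session's trades are given as (price, quantity) pairs:

```python
# Minimal sketch of the VWAP definition, assuming hypothetical trades
# given as (price, quantity) pairs for one trading session.
def vwap(trades):
    """VWAP = sum(price * quantity) / sum(quantity)."""
    total_value = sum(p * q for p, q in trades)
    total_volume = sum(q for _, q in trades)
    return total_value / total_volume

def typical_price(low, high, close):
    """'Typical price' variant: average of low, high and closing prices."""
    return (low + high + close) / 3

trades = [(100.0, 50), (101.0, 30), (99.5, 20)]
print(round(vwap(trades), 2))  # 100.2
```

Note that the VWAP (100.2 here) differs from the simple average of the three prices (100.17), because larger trades weigh more.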

Interpreting the VWAP indicator / Key takeaways

Volume-weighted average price (VWAP) is a popular technical indicator used by traders and investors to identify trends, support and resistance levels, and potential entry and exit points. It can also be used, for example, to assess the liquidity and market depth of a security. If prices are closely clustered around the VWAP, it suggests that there is a lot of liquidity and that the market is well balanced. If prices are spread out over a wide range around the VWAP, it suggests that the market is less liquid and that there is a higher risk of wide price swings.

Breakout above the VWAP line suggests a bullish trend

A breakout above VWAP suggests that the price has momentum and is moving upwards. This could be due to increased buying pressure from investors, indicating a shift in sentiment towards the security. Once the price breaks above VWAP, it can act as a support level, making it more difficult for the price to fall below that level.

This could be an opportunity to enter a long position, anticipating the price to continue rising.

Breakdown below the VWAP line suggests a bearish trend

If the price of a security breaks below the VWAP line, it may signal a potential bearish trend. This could be an opportunity to enter a short position, anticipating the price to continue falling.

VWAP line can act as support or resistance level

The VWAP line can also function as a support or resistance level, representing a price range where the price of the security may tend to bounce off.

VWAP to identify trends

If the VWAP line is trending upwards, it suggests an overall upward trend in the price of the security. This could indicate favorable conditions for long-term investments. Conversely, if the VWAP line is trending downwards, it suggests an overall downward trend in the price of the security. This could indicate caution for long-term investments.

Conclusion

It is important to note that VWAP is just one indicator, and it should not be used in isolation. It is always a good idea to consider other technical indicators, such as the moving average convergence divergence (MACD) and the relative strength index (RSI), before making any trading decisions.

Often, multiple interpretations are possible and because of this, it is important to use the VWAP in combination with other indicators.

As mentioned above, a breakdown below the VWAP may suggest a bearish trend. But it can also be interpreted differently: stocks trading below the VWAP are sometimes considered undervalued, and those trading above it overvalued.

So while some institutions may prefer to buy when the price of the security is below the VWAP or sell when it is above, VWAP is not the only factor to consider. In strong uptrends, the price may continue to move higher for many days without dropping below the VWAP at all. Therefore, waiting for the price to fall below the VWAP could mean a missed opportunity if prices are rising quickly.

Why should I be interested in this post?

This article will provide students interested in business and finance a comprehensive overview of VWAP and how it is used by traders and investors. By understanding this fundamental concept in technical analysis, students will gain a valuable tool for making informed investment decisions.

Related posts on the SimTrade blog

   ▶ Shruti CHAND Technical analysis

   ▶ Shruti CHAND Technical Analysis, Moving Averages

   ▶ Theo SCHWERTLE Can technical analysis actually help to make better trading decisions

   ▶ Giovanni PAGLIARDI Tail relation between return and volume

Useful resources

Academic articles

Menkhoff, L. (2010) The use of technical analysis by fund managers: International evidence, Journal of Banking & Finance 34(11): 2573-2586.

Kirkpatrick II, C. D., and J.R. Dahlquist (2010) Technical Analysis: The Complete Resource for Financial Market Technicians. FT press.

Videos

Humbled Trader VWAP Trading Strategy Crash Course (YouTube video)

MHFIN VWAP Explained For Beginners In Under 5 Minutes (YouTube video)

About the author

The article was written in December 2023 by Raphael TRAEN (ESSEC Business School, Global BBA, 2023-2024).

Understanding Correlation in the Financial Landscape: How It Drives Portfolio Diversification

Understanding Correlation in the Financial Landscape: How It Drives Portfolio Diversification

Raphael TRAEN

In this article, Raphael TRAEN (ESSEC Business School, Global BBA, 2023-2024) delves into the fascinating world of correlation and its profound impact on diversification strategies in the financial realm. Understanding correlation is crucial for crafting well-diversified investment portfolios that can effectively mitigate risk and enhance overall performance (the famous trade-off between risk and expected return).

Statistical correlation

Definition

Statistical correlation is a quantitative measure of the strength and direction of the linear relationship between two variables. It describes how two variables are related to each other and how one variable changes in response to the other (but remember that correlation is not causality!).

Mathematically (or more precisely statistically), correlation is defined by the following formula:

Correlation formula

where ρ1,2 is the correlation coefficient between the two random variables (say X1 and X2), 𝜎1,2 is the covariance between the two random variables, and 𝜎1 and 𝜎2 are the standard deviations of the two random variables.

Correlation is measured on a scale from -1 to +1, with -1 representing a perfect negative correlation, +1 representing a perfect positive correlation, and 0 representing no correlation.
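As a minimal sketch of this formula, assuming two short hypothetical return series, dividing the covariance by the product of the standard deviations gives the same value as NumPy's built-in correlation function:

```python
import numpy as np

# Two hypothetical return series (illustration only)
x1 = np.array([0.01, -0.02, 0.03, 0.00, 0.015])
x2 = np.array([0.012, -0.015, 0.025, 0.001, 0.010])

cov_12 = np.cov(x1, x2, ddof=1)[0, 1]                       # covariance sigma_{1,2}
rho_12 = cov_12 / (np.std(x1, ddof=1) * np.std(x2, ddof=1)) # correlation rho_{1,2}

# np.corrcoef applies the same formula directly
assert abs(rho_12 - np.corrcoef(x1, x2)[0, 1]) < 1e-12
print(rho_12)  # a value between -1 and +1
```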

Correlation vs Independence

Correlation and independence are two statistical measures that describe the relationship between two variables. As already mentioned, correlation quantifies the strength and direction of the relationship, ranging from perfect negative (one variable decreases as the other increases) to perfect positive (both variables increase or decrease together). Independence on the other hand indicates the absence of any consistent relationship between the variables.

If two random variables are independent, their correlation is equal to zero. But if the correlation between two random variables is equal to zero, it does not necessarily mean that they are independent. This can be illustrated with an example. Let us consider two random variables, X and Y, defined as follows: X is a random variable that takes discrete values from the set {-1, 0, 1} with equal probability (1/3) and Y is defined as Y = X2.

E(X) = (1 + 0 + (-1))/3 = 0
E(Y) = E(X²) = (1² + 0² + (-1)²)/3 = 2/3
E(XY) = (-1 × 1 + 0 × 0 + 1 × 1)/3 = 0

Cov(X, Y) = E(XY) – E(X)E(Y) = 0 – 0 × (2/3) = 0

As Corr(X, Y) is equal to Cov(X, Y) / (sqrt(Var(X)) × sqrt(Var(Y))), we find that Corr(X, Y) = 0, even though X and Y are clearly not independent.
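The computations above can be checked exactly with a few lines of code, taking expectations over the three equally likely outcomes:

```python
# Exact check of the X, Y = X^2 example over the outcomes {-1, 0, 1}
outcomes = [-1, 0, 1]
E = lambda f: sum(f(x) for x in outcomes) / 3  # expectation under uniform probability

EX = E(lambda x: x)          # E(X) = 0
EY = E(lambda x: x**2)       # E(Y) = 2/3
EXY = E(lambda x: x * x**2)  # E(XY) = 0
cov = EXY - EX * EY
print(cov)  # 0.0 -> zero correlation, yet Y is a deterministic function of X
```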

Application in finance

We now consider a financial application: the construction of portfolios. We show that correlation is a key input when building portfolios.

If the concept of portfolios is completely new to you, I recommend first reading through the article by Youssef LOURAOUI about Portfolio.

Portfolio with two assets

In the world of investments, understanding the expected return and variance of a portfolio is crucial for informed decision-making. These two statistical measures provide valuable insights into the potential performance and risk of a collection of assets held together. In what follows, we first focus on a portfolio consisting of two assets.

Return and expected return of a portfolio

The return of a two-asset portfolio P is computed as

Return two assets

where w1 and w2 are the weights of the two assets in the portfolio and R1 and R2 are the returns of the two assets.

The expected return of the two-asset portfolio P is computed as

Expected return two assets

where w1 and w2 are the weights of the two assets in the portfolio and μ1 and μ2 are the expected returns of the two assets.

Risk of a portfolio

The standard deviation (square root of the variance) of a two-asset portfolio is computed as

Standard deviation of the return of a two-asset portfolio

or

Standard deviation of the return of a two-asset portfolio

where w1 and w2 are the weights of the two assets in the portfolio, 𝜎1 and 𝜎2 are the standard deviations of the returns of the two assets, and 𝜎1,2 and ρ1,2 are the covariance and correlation coefficient between the two assets returns.

The first expression uses the covariance 𝜎1,2 and the second expression the correlation ρ1,2.

Impact of correlation on diversification (the case of two assets)

From the above formulas follows a very interesting result known as the “diversification effect”: with two assets, suppose the weights of both securities are positive. As long as the correlation coefficient is less than 1, the standard deviation of a portfolio of two securities is less than the weighted average of the standard deviations of the individual securities. Investors can thus obtain the same level of expected return with lower risk.
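A minimal sketch of this effect, with hypothetical weights and standard deviations, shows that the portfolio standard deviation equals the weighted average only when the correlation is equal to 1:

```python
import math

# Hypothetical two-asset portfolio (illustration only)
w1, w2 = 0.5, 0.5          # positive weights
sigma1, sigma2 = 0.20, 0.30  # individual standard deviations

def portfolio_std(rho):
    """Standard deviation of the two-asset portfolio for correlation rho."""
    var = (w1 * sigma1) ** 2 + (w2 * sigma2) ** 2 \
          + 2 * w1 * w2 * rho * sigma1 * sigma2
    return math.sqrt(var)

weighted_avg = w1 * sigma1 + w2 * sigma2  # 0.25
for rho in (1.0, 0.5, 0.0, -1.0):
    print(rho, round(portfolio_std(rho), 4))
# With rho = 1 the portfolio std equals the weighted average (0.25);
# for any rho < 1 it is strictly smaller.
```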

The figures below illustrate the impact of the correlation between the two assets on portfolio diversification and the efficient portfolio frontier. For a given level of portfolio risk, the lower the correlation, the higher the expected return of the portfolio.

Impact of the correlation on portfolio diversification

Impact of the correlation on portfolio diversification

Impact of the correlation on portfolio diversification

Impact of the correlation on portfolio diversification

Impact of the correlation on portfolio diversification

You can download below an Excel file (from Prof. Longin’s course) that illustrates the impact of correlation on portfolio diversification.

Excel file on impact of correlation

Diversification effect (extension to several assets)

With many assets, suppose the weights of all securities are positive. As long as the correlations between pairs of securities are less than 1, the standard deviation of a portfolio of many assets is less than the weighted average of the standard deviations of the individual securities.
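The multi-asset case can be sketched with a covariance matrix: portfolio variance is the quadratic form wᵀΣw. The three assets and the common pairwise correlation below are hypothetical inputs for illustration:

```python
import numpy as np

# Hypothetical three-asset portfolio (illustration only)
w = np.array([0.4, 0.3, 0.3])          # positive weights
sigmas = np.array([0.15, 0.20, 0.25])  # individual standard deviations
rho = 0.3                              # same correlation for all pairs, for simplicity

corr = np.full((3, 3), rho)
np.fill_diagonal(corr, 1.0)
Sigma = np.outer(sigmas, sigmas) * corr  # covariance matrix

port_std = np.sqrt(w @ Sigma @ w)        # portfolio standard deviation
weighted_avg = float(w @ sigmas)         # weighted average of individual stds
print(port_std < weighted_avg)  # True: diversification lowers risk when rho < 1
```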

Why should I be interested in this post?

Understanding correlation is an essential skill for any investor seeking to build a well-diversified portfolio that can withstand market volatility and achieve long-term growth. By carefully analyzing correlation dynamics and incorporating correlation analysis into their investment strategies, investors can effectively manage risk exposure and build resilient portfolios that can weather market storms and emerge stronger on the other side.

Related posts on the SimTrade blog

   ▶ Youssef LOURAOUI Portfolio

   ▶ Jayati WALIA Standard deviation

   ▶ Youssef LOURAOUI Hedge fund diversification

   ▶ Lou PERRONE Navigating the Balance Between Risk and Reward in Finance

Useful resources

Prof. Longin’s ESSEC Master in Management “Fundamentals of finance” course.

William Pouder’s ESSEC BBA “Finance” course.

About the author

The article was written in December 2023 by Raphael TRAEN (ESSEC Business School, Global BBA, 2023-2024).

Ethereum – Unleashing Blockchain Innovation

Ethereum – Unleashing Blockchain Innovation

 Snehasish CHINARA

In this article, Snehasish CHINARA (ESSEC Business School, Grande Ecole Program – Master in Management, 2022-2024) explains Ethereum, often considered the most important cryptocurrency after Bitcoin.

Historical context and background

Ethereum is a groundbreaking blockchain platform that emerged in the wake of Bitcoin’s success in 2015. While Bitcoin introduced and popularized the blockchain concept, Ethereum has leveraged this technology more effectively than any other digital currency. Promoters of new projects tend to rely on Ethereum’s tools rather than embark on the lengthy and expensive process of developing a new blockchain. Ethereum was conceived by a young Canadian programmer, Vitalik Buterin, who saw limitations in Bitcoin’s functionality and envisioned a decentralized platform capable of executing smart contracts. Buterin’s idea gained traction in the cryptocurrency community, and he, along with a team of developers, published the Ethereum whitepaper in late 2013. The platform’s official development began in 2014, with a crowdfunding campaign that raised over $18 million in Bitcoin, making it one of the most successful initial coin offerings (ICOs) of its time. Ethereum’s genesis block was mined on July 30, 2015, marking the official launch of the network.

Ethereum’s innovative concept of smart contracts and decentralized applications (DApps) quickly garnered attention within the blockchain and cryptocurrency space. The platform introduced a Turing-complete programming language, enabling developers to create a wide array of decentralized applications. Ethereum’s native cryptocurrency, Ether (ETH), serves as both a digital currency and a utility token within the ecosystem. Over the years, Ethereum has undergone several network upgrades to improve scalability and security, most notably the transition from a proof-of-work (PoW) to a proof-of-stake (PoS) consensus mechanism with the Ethereum 2.0 upgrade. This transition aims to address the network’s scalability issues and reduce its energy consumption, positioning Ethereum as a sustainable and versatile blockchain platform for the future. Today, Ethereum continues to play a pivotal role in the blockchain and decentralized finance (DeFi) space, powering a vast array of projects, including NFT platforms, decentralized exchanges, and decentralized applications that have reshaped the way we think about finance and technology.

Ethereum Logo
Ethereum Logo
Source: Yahoo! Finance.

Figure 1. Key Dates in Ethereum History
 Key Dates in Ethereum History
Source: Yahoo! Finance.

Key Features of Ethereum

Smart Contracts

Ethereum is renowned for its pioneering smart contract functionality. Smart contracts are self-executing agreements with predefined rules and conditions, enabling automated and trustless transactions. This feature has broad applications in various industries, including finance, supply chain management, and legal services.

Decentralization

Ethereum operates on a decentralized network of nodes, making it resistant to censorship and single points of failure. This decentralization ensures the security and integrity of the blockchain, with no single entity having control over the network.

Ether (ETH)

Ethereum’s native cryptocurrency, Ether, serves as both a digital currency and a utility token. It’s used to pay for transaction fees, secure the network through staking in Ethereum 2.0, and as a medium of exchange within the ecosystem.

Interoperability

Ethereum is designed to interact with other blockchains and networks, fostering compatibility and collaboration across the blockchain ecosystem. Projects like Polkadot and Cosmos aim to enhance this interoperability.

EVM (Ethereum Virtual Machine)

The Ethereum Virtual Machine is a runtime environment for executing smart contracts. It’s a critical component that ensures the same execution of smart contracts across all Ethereum nodes, making Ethereum’s ecosystem reliable and consistent.

EIPs (Ethereum Improvement Proposals)

Ethereum has a robust governance model for protocol upgrades and improvements, with EIPs serving as the mechanism for proposing and implementing changes. This allows for community-driven innovation and adaptation.

Use Cases of Ethereum

Decentralized Finance (DeFi)

Ethereum is at the heart of the DeFi movement, offering lending, borrowing, trading, and yield farming services through DApps like Compound, Aave, and Uniswap. DeFi has disrupted traditional finance, providing open and inclusive access to financial services.

Non-Fungible Tokens (NFTs)

Ethereum’s ERC-721 and ERC-1155 token standards have fueled the NFT boom. NFTs enable the ownership and trade of unique digital assets, from art and music to virtual real estate and collectibles, all recorded on the blockchain.

Supply Chain Management

Ethereum’s transparent and tamper-proof ledger is used to track and verify the authenticity and provenance of products. This enhances supply chain efficiency and trust, reducing fraud and counterfeiting.

Gaming and Virtual Worlds

Ethereum is the platform of choice for blockchain-based gaming and virtual reality experiences. DApps like Decentraland and Axie Infinity allow users to trade in-game assets and participate in virtual economies.

Tokenization of Assets

Real-world assets, such as real estate, stocks, and commodities, can be tokenized on the Ethereum blockchain, making them more accessible for investment and trading.

Identity Verification

Ethereum can be used to secure and manage digital identities, enhancing privacy and reducing the risk of identity theft.

Social Impact

Ethereum is leveraged for social impact projects, including humanitarian aid distribution, voting systems, and tracking philanthropic donations, ensuring transparency and accountability.

Content Distribution

Ethereum-based projects are exploring decentralized content platforms, enabling creators to have more control over their intellectual property and revenue.

Ethereum’s versatility and ongoing development make it a crucial platform for a wide range of applications, from financial innovation to social change and beyond, driving the evolution of the blockchain and cryptocurrency space.

Technology and underlying blockchain

Ethereum’s underlying technology is rooted in blockchain, a distributed ledger system known for its security, transparency, and decentralization. Ethereum, like Bitcoin, employs a blockchain to record and verify transactions, but it offers a distinct set of features and capabilities that set it apart. At the core of Ethereum’s technology is the Ethereum Virtual Machine (EVM), a decentralized computing environment that executes smart contracts. Smart contracts are self-executing agreements with predefined rules and conditions that automate processes without the need for intermediaries.

Ethereum uses a consensus mechanism known as Proof of Stake (PoS), which is a significant departure from Bitcoin’s Proof of Work (PoW). PoS allows network participants, known as validators, to create new blocks and secure the network by locking up a certain amount of Ether as collateral. This approach is more energy-efficient and scalable compared to PoW, addressing some of the limitations that Bitcoin faces. Ethereum’s blockchain is a public and permissionless network, meaning that anyone can participate, transact, and develop decentralized applications (DApps) on the platform without needing approval.

The Ethereum ecosystem also employs a variety of token standards, with ERC-20 and ERC-721 being the most well-known. ERC-20 tokens are fungible and often used for cryptocurrencies, while ERC-721 tokens are non-fungible and have powered the explosion of NFTs (Non-Fungible Tokens). These standards have facilitated the creation and interoperability of a vast array of digital assets and DApps on the platform. Ethereum’s robust governance model, through Ethereum Improvement Proposals (EIPs), allows the community to suggest and implement changes, ensuring that the platform remains adaptable and responsive to evolving needs and challenges. Ethereum’s groundbreaking technology and active development community have positioned it as a leader in the blockchain space, with far-reaching implications for industries beyond just cryptocurrencies.

Supply of coins

Ethereum initially used a proof-of-work (PoW) consensus algorithm for coin mining, similar to Bitcoin. The process involved miners solving complex mathematical puzzles to validate transactions and add new blocks to the blockchain. Miners competed to solve these puzzles, and the first one to succeed was rewarded with newly minted Ethereum coins (ETH). This process was resource-intensive and required significant computational power.

However, Ethereum has transitioned to a proof-of-stake (PoS) consensus mechanism as part of its Ethereum 2.0 upgrade. The PoS model does not rely on miners solving computational puzzles but instead relies on validators who lock up a certain amount of cryptocurrency as collateral to propose and validate new blocks. Validators are chosen to create new blocks based on the amount of cryptocurrency they hold and are willing to “stake” as collateral.

This transition to PoS occurred in multiple phases. The Beacon Chain, the PoS blockchain that initially ran in parallel to the existing PoW chain, was launched in December 2020, and the full transition to PoS was completed with “The Merge” in September 2022.

As of Q1 2023, there are approximately 121,826,163.06 Ethereum (ETH) coins in circulation, a key distinction from Bitcoin, which has a capped supply of 21 million. Ethereum, created by Vitalik Buterin, was designed without a specific supply limit, allowing for an unlimited number of coins if issuance continues. Despite this, there is a cap of 18 million ETH coins that can be issued annually, equating to around 2 ETH per block. Ethereum Classic (ETC), a separate blockchain resulting from a community dispute, also exists with about 135.3 million coins. The Ethereum blockchain’s size was 175 GB in 2021, considerably smaller than Bitcoin’s 412 GB. Approximately 5750 Ethereum blocks are produced daily, with around 2,151 active nodes globally, primarily in the USA. Ethereum’s potential to become deflationary is acknowledged, contingent on issuance falling below the amount of ETH burned, as stated in a GitHub disclaimer.

Figure 2. Number of Ethereum Transactions per Day
Number of Ethereum transactions per day
Source: BitInfoCharts (Ethereum Transactions historical chart).

Historical data for Ethereum

How to get the data?

Ethereum is one of the most popular cryptocurrencies on the market, and historical data for Ethereum such as prices and volumes traded can easily be downloaded from internet sources such as Yahoo! Finance, Blockchain.com and CoinMarketCap. For example, you can download data for Ethereum from Yahoo! Finance (the Yahoo! code for Ethereum is ETH-USD).

Figure 4. Ethereum data
 Ethereum data
Source: Yahoo! Finance.

Historical data for the Ethereum market prices

Historical data on the price of Ethereum holds paramount significance in understanding the cryptocurrency’s market trends, investor behavior, and overall performance over time. Analyzing historical price data allows investors, analysts, and researchers to identify patterns, cycles, and potential indicators that may influence future price movements. It provides valuable insights into market sentiment, periods of volatility, and the impact of significant events or developments within the Ethereum ecosystem. Traders use historical data to formulate strategies, assess risk, and make informed decisions. Furthermore, the data aids in evaluating the success of protocol upgrades, regulatory changes, and shifts in broader economic conditions, offering a comprehensive view of Ethereum’s evolution. The historical price data of Ethereum serves as a crucial tool for market participants seeking to navigate the dynamic and sometimes unpredictable nature of the cryptocurrency market.

With the number of coins in circulation, the information on the price of coins for a given currency is also important to compute Ethereum’s market capitalization.

Figure 5 below represents the evolution of the price of Ethereum in US dollar over the period Nov 2017 – Dec 2023. The price corresponds to the “closing” price (observed at 10:00 PM CET at the end of the month).

Figure 5. Evolution of the Ethereum price

Source: Yahoo! Finance.

R program

The R program below written by Shengyu ZHENG allows you to download the data from Yahoo! Finance website and to compute summary statistics and risk measures about the Ethereum.

Download R file

Data file

The R program that you can download above allows you to download the data for the Ethereum from the Yahoo! Finance website. The database starts in November 2017.

Table 3 below represents the top of the data file for the Ethereum downloaded from the Yahoo! Finance website with the R program.

Table 3. Top of the data file for the Ethereum.
Top of the file for the Ethereum data
Source: computation by the author (data: Yahoo! Finance website).

Python code

You can download the Python code used to download the data from Yahoo! Finance.

Download the Excel file with Ethereum data

Python script to download Ethereum historical data and save it to an Excel sheet:

import yfinance as yf
import pandas as pd

# Define the ticker symbol for Ethereum
eth_ticker = "ETH-USD"

# Define the date range for historical data
start_date = "2020-01-01"
end_date = "2022-01-01"

# Download historical data using yfinance
eth_data = yf.download(eth_ticker, start=start_date, end=end_date)

# Create a Pandas DataFrame from the downloaded data
eth_df = pd.DataFrame(eth_data)

# Define the Excel file path
excel_file_path = "ethereum_historical_data.xlsx"

# Save the data to an Excel sheet (writing .xlsx files requires the openpyxl package)
eth_df.to_excel(excel_file_path, sheet_name="ETH Historical Data")

print(f"Data saved to {excel_file_path}")

# Make sure you have the required libraries installed and adjust the "start_date" and "end_date" variables to the desired date range for the historical data you want to download.

Evolution of the Ethereum

Figure 6 below gives the evolution of the Ethereum price on a daily basis.


Figure 6. Evolution of the Ethereum.

Source: computation by the author (data: Yahoo! Finance website).

Figure 7 below gives the evolution of the Ethereum returns from November 09, 2017 to December 31, 2022 on a daily basis.

Figure 7. Evolution of the Ethereum returns.

Source: computation by the author (data: Yahoo! Finance website).

Summary statistics for the Ethereum

The R program that you can download above also allows you to compute summary statistics about the returns of the Ethereum.

Table 4 below presents the following summary statistics estimated for the Ethereum:

  • The mean
  • The standard deviation (the square root of the variance)
  • The skewness
  • The kurtosis.

The mean, the standard deviation / variance, the skewness, and the kurtosis refer to the first, second, third and fourth moments of the statistical distribution of returns, respectively.
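These four moments can be sketched in a few lines. The returns below are simulated Gaussian draws used as a stand-in for the actual Ethereum series downloaded above:

```python
import numpy as np

# Simulated daily returns as a stand-in for the Ethereum data (illustration only)
rng = np.random.default_rng(0)
returns = rng.normal(loc=0.001, scale=0.04, size=100_000)

mean = returns.mean()
std = returns.std(ddof=1)             # standard deviation (square root of the variance)
z = (returns - mean) / returns.std()  # standardized returns
skewness = np.mean(z**3)              # third moment: asymmetry of the distribution
kurtosis = np.mean(z**4)              # fourth moment: about 3 for a Gaussian
print(round(skewness, 2), round(kurtosis, 2))
```

For a fat-tailed series like cryptocurrency returns, the kurtosis computed this way typically comes out well above 3.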

Table 4. Summary statistics for the Ethereum.
Summary statistics for the Ethereum
Source: computation by the author (data: Yahoo! Finance website).

Statistical distribution of the Ethereum returns

Historical distribution

Figure 8 represents the historical distribution of the Ethereum daily returns for the period from November 09, 2017 to December 31, 2022.

Figure 8. Historical distribution of the Ethereum returns.

Source: computation by the author (data: Yahoo! Finance website).

Gaussian distribution

The Gaussian distribution (also called the normal distribution) is a parametric distribution with two parameters: the mean and the standard deviation of returns. We estimated these two parameters over the period from November 09, 2017 to December 31, 2022. The annualized mean of daily returns is equal to 30.81% and the annualized standard deviation of daily returns is equal to 62.33%.
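The annualized figures quoted above relate to daily ones as sketched below. The daily values are hypothetical (chosen only to land near the quoted annualized numbers), and the 365-day convention is an assumption for a cryptocurrency that trades every calendar day (equity studies often use 252 trading days instead):

```python
import math

# Hypothetical daily mean and standard deviation (illustration only)
daily_mean, daily_std = 0.00084, 0.0326

ann_mean = daily_mean * 365           # the mean scales linearly with time
ann_std = daily_std * math.sqrt(365)  # volatility scales with the square root of time
print(round(ann_mean, 4), round(ann_std, 4))  # 0.3066 0.6228
```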

Figure 9 below represents the Gaussian distribution of the Ethereum daily returns with parameters estimated over the period from November 09, 2017 to December 31, 2022.

Figure 9. Gaussian distribution of the Ethereum returns.

Source: computation by the author (data: Yahoo! Finance website).

Risk measures of the Ethereum returns

The R program that you can download above also allows you to compute risk measures about the returns of the Ethereum.

Table 5 below presents the following risk measures estimated for the Ethereum:

  • The long-term volatility (the unconditional standard deviation estimated over the entire period)
  • The short-term volatility (the standard deviation estimated over the last three months)
  • The Value at Risk (VaR) for the left tail (the 5% quantile of the historical distribution)
  • The Value at Risk (VaR) for the right tail (the 95% quantile of the historical distribution)
  • The Expected Shortfall (ES) for the left tail (the average loss beyond the 5% quantile of the historical distribution)
  • The Expected Shortfall (ES) for the right tail (the average return beyond the 95% quantile of the historical distribution)
  • The Stress Value (SV) for the left tail (the 1% quantile of the tail distribution estimated with a Generalized Pareto distribution)
  • The Stress Value (SV) for the right tail (the 99% quantile of the tail distribution estimated with a Generalized Pareto distribution)
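The historical VaR and ES for the left tail can be sketched as follows in Python (the downloadable program above is in R); the return series is hypothetical:

```python
import numpy as np

def var_es(returns, level=0.05):
    """Historical left-tail risk measures: the VaR is the `level` quantile
    of the return distribution, the ES is the average return beyond it."""
    r = np.asarray(returns, dtype=float)
    var = np.quantile(r, level)   # e.g. the 5% quantile
    es = r[r <= var].mean()       # average loss beyond the VaR
    return var, es

# For the right tail, apply the same logic at the 95% quantile with r >= var
```

By construction the ES is always at least as severe as the VaR at the same level, since it averages the returns beyond the quantile.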

Table 5. Risk measures for the Ethereum.
Risk measures for the Ethereum
Source: computation by the author (data: Yahoo! Finance website).

The volatility is a global measure of risk as it considers all the returns. The Value at Risk (VaR), Expected Shortfall (ES) and Stress Value (SV) are local measures of risk as they focus on the tails of the distribution. The study of the left tail is relevant for an investor holding a long position in the Ethereum while the study of the right tail is relevant for an investor holding a short position in the Ethereum.

Why should I be interested in this post?

Ethereum, the pioneering blockchain platform, is an essential topic for management students due to its potential to transform industries, create innovative business opportunities, and disrupt traditional financial systems. Understanding Ethereum’s smart contracts, DeFi ecosystem, NFT market, and global impact can provide students with a competitive edge in a rapidly evolving business landscape, enabling them to navigate emerging trends, make informed investment decisions, and explore entrepreneurship in the digital economy.

Related posts on the SimTrade blog

About cryptocurrencies

▶ Snehasish CHINARA Bitcoin: the mother of all cryptocurrencies

▶ Snehasish CHINARA How to get crypto data

▶ Alexandre VERLET Cryptocurrencies

▶ Youssef EL QAMCAOUI Decentralised Financing

▶ Hugo MEYER The regulation of cryptocurrencies: what are we talking about?

About statistics

▶ Shengyu ZHENG Moments de la distribution

▶ Shengyu ZHENG Mesures de risques

▶ Jayati WALIA Returns

Useful resources

Academic research about risk

Longin F. (2000) From VaR to stress testing: the extreme value approach Journal of Banking and Finance, N°24, pp 1097-1130.

Longin F. (2016) Extreme events in finance: a handbook of extreme value theory and its applications Wiley Editions.

Data

Yahoo! Finance

Yahoo! Finance Historical data for Ethereum

CoinMarketCap Historical data for Ethereum

About the author

The article was written in December 2023 by Snehasish CHINARA (ESSEC Business School, Grande Ecole Program – Master in Management, 2022-2024).

Free Cash Flow: A Critical Metric in Finance


Lou PERRONE

In this article, Lou PERRONE (ESSEC Business School, Global BBA, 2019-2023) explains the concept of Free Cash Flow and its importance in financial analysis.

What is Free Cash Flow?

Free Cash Flow (FCF) is a critical financial metric used to determine the amount of cash generated by a business after accounting for capital expenditures. It represents the cash available for distribution among all the securities holders of a company (equity holders and debt holders) and provides insight into a company’s financial health and its ability to pursue opportunities without external financing.

How is Free Cash Flow Calculated?

The general formula for calculating Free Cash Flow is:

Free Cash Flow = Operating Cash Flow – Capital Expenditures

Operating Cash Flow refers to the total amount of cash generated by a company’s core operating activities. It reflects the cash generated from the actual business operations of selling goods and services.

Capital Expenditures (CapEx) are funds used by the company to purchase, upgrade, and maintain physical assets. This could include expenditures on property, plant, equipment, and technology.
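The formula above can be illustrated with a short Python sketch using hypothetical figures:

```python
def free_cash_flow(operating_cash_flow, capital_expenditures):
    """Free Cash Flow = Operating Cash Flow - Capital Expenditures."""
    return operating_cash_flow - capital_expenditures

# Hypothetical figures (in millions): a positive FCF means surplus cash,
# a negative FCF means the firm spent more on assets than it generated
fcf = free_cash_flow(operating_cash_flow=500.0, capital_expenditures=180.0)  # 320.0
```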

Example of FCF

The Excel document provided as an example is a critical financial tool that enables an in-depth analysis of ABC Limited’s Free Cash Flow (FCF). By meticulously tracking the inflows and outflows of cash, the spreadsheet highlights the company’s capacity to generate cash after covering all its capital expenditures. Observing the ‘Net Cash Flow’ row, we can discern the periods where the company has successfully managed its resources to produce a positive FCF, which indicates surplus cash availability that can be used for debt repayment, reinvestment in the business, dividends to shareholders, or as a reserve for future growth opportunities. Conversely, any negative FCF would warrant a closer investigation into the company’s spending on assets or its operational efficiency. The ability to forecast and analyze FCF is crucial for business sustainability and strategic financial planning, as it provides a clearer picture of financial health beyond simple profitability metrics.

Example of computation of free cash flows.
Example of computation of free cash flow
Source: the author.

Why is Free Cash Flow Significant?

FCF is an important indicator of a company’s financial strength. It shows how efficient a company is at generating cash and is often used by analysts and investors to assess whether a company has the financial flexibility to invest in its business, pay down debt, return money to shareholders, or weather economic downturns.

Factors Impacting Free Cash Flow

While many elements can impact the calculation of FCF, some key influencers include:

  • Revenue Growth: Increases in sales can lead to higher operating cash flows.
  • Operating Margins: Efficiency in managing operational costs can lead to better margins and in turn affect FCF positively.
  • Capital Efficiency: Companies that manage their capital expenses efficiently can have higher free cash flows, as they spend less on fixed assets relative to the cash they generate.

The Dual Nature of FCF

A consistently positive FCF indicates a company’s ability to generate surplus cash after meeting all its operational and capital requirements. On the other hand, consistently negative FCF might suggest that the company is investing heavily for future growth or struggling to generate enough cash.

Interpreting Free Cash Flow

It’s crucial to contextualize FCF within the industry and the specific company’s growth stage. High-growth firms might have lower FCF due to heavy investments, while mature companies might generate more consistent free cash flows.

Why should I be interested in this post?

Understanding Free Cash Flow is indispensable for management students. It not only measures a company’s profitability but also its liquidity, solvency, and the overall health of its business model. By assessing FCF in conjunction with other financial metrics, students can gain a comprehensive view of a company’s financial health, aiding them in making informed investment or management decisions. Whether it’s for investment appraisal or corporate financial analysis, understanding the nuances of FCF is fundamental for anyone in the realm of finance and business.

Related posts on the SimTrade blog

   ▶ All posts about Financial Techniques

Useful resources

Damodaran A. (2012) Investment Valuation: Tools and Techniques for Determining the Value of Any Asset 3rd ed. New York: Wiley.

Brealey R.A., Myers S.C., and Allen F. (2011) Principles of Corporate Finance 11th ed. New York: McGraw-Hill/Irwin.

Ross S.A., Westerfield R.W., and Jaffe J. (2016) Corporate Finance 11th ed. New York: McGraw-Hill/Irwin.

About the author

The article was written in December 2023 by Lou PERRONE (ESSEC Business School, Global BBA, 2019-2023).

Extreme returns and tail modelling of the CSI 300 index for the Chinese equity market


Shengyu ZHENG

In this article, Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024) describes the statistical behavior of extreme returns of the CSI 300 index for the Chinese equity market and explains how extreme value theory can be used to model the tails of its distribution.

The CSI 300 index for the Chinese equity market

The CSI 300 Index, or China Securities Index 300, is a comprehensive stock market benchmark that tracks the performance of the top 300 A-share stocks listed on the Shanghai and Shenzhen stock exchanges. Introduced in 2005, the index is designed to represent a broad and diverse spectrum of China’s leading companies across various sectors, including finance, technology, consumer goods, and manufacturing. The CSI 300 is a crucial indicator of the overall health and direction of the Chinese stock market, reflecting the dynamic growth and evolution of China’s economy.

The CSI 300 employs a free-float market capitalization-weighted methodology. This means that the index’s composition and movements are influenced by the market value of the freely tradable shares, providing a more accurate representation of the companies’ actual impact on the market. As China continues to play a significant role in the global economy, the CSI 300 has become a key reference point for investors seeking exposure to the Chinese market and monitoring economic trends in the dynamic economy. With its emphasis on the country’s most influential and traded stocks, the CSI 300 serves as an essential tool for both domestic and international investors navigating the complexities of the Chinese financial landscape.
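The free-float market capitalization weighting described above can be sketched as follows; the function and all figures are hypothetical, for illustration only:

```python
def free_float_weights(prices, shares_outstanding, free_float_factors):
    """Free-float market-cap weights: each stock's weight is its price times
    its freely tradable shares, divided by the total across the index."""
    caps = [p * s * f
            for p, s, f in zip(prices, shares_outstanding, free_float_factors)]
    total = sum(caps)
    return [c / total for c in caps]

# Two hypothetical constituents with equal full market caps, but only half
# of the second company's shares are freely tradable
weights = free_float_weights([10.0, 10.0], [100, 100], [1.0, 0.5])  # [2/3, 1/3]
```

The free-float factor shrinks the weight of companies with large locked-up stakes, which is why this methodology better reflects the shares actually available to investors.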

In this article, we focus on the CSI 300 index over the timeframe from March 11th, 2021, to April 1st, 2023. Below is a line chart depicting the evolution of the index level over this period.

Figure 1 below gives the evolution of the CSI 300 index from March 11th, 2021, to April 1st, 2023 on a daily basis.

Figure 1. Evolution of the CSI 300 index.
Evolution of the CSI 300 index
Source: computation by the author (data: Yahoo! Finance website).

Figure 2 below gives the evolution of the logarithmic returns of the CSI 300 index from March 11th, 2021 to April 1st, 2023 on a daily basis. We observe volatility clustering, reflecting large price fluctuations in both directions (up and down movements). This alternation of periods of low and high volatility is well modeled by ARCH models.

Figure 2. Evolution of the CSI 300 index logarithmic returns.
Evolution of the CSI 300 index return
Source: computation by the author (data: Yahoo! Finance website).

Summary statistics for the CSI 300 index

Table 1 below presents the summary statistics estimated for the CSI 300 index:

Table 1. Summary statistics for the CSI 300 index.
summary statistics of the CSI 300 index returns
Source: computation by the author (data: Yahoo! Finance website).

The mean, the standard deviation / variance, the skewness, and the kurtosis refer to the first, second, third and fourth moments of the statistical distribution of returns, respectively. We can conclude that during this timeframe, the CSI 300 index follows a downward trend, with relatively large daily deviations, negative skewness and excess kurtosis.

Tables 2 and 3 below present the top 10 negative daily returns and top 10 positive daily returns for the index over the period from March 11th, 2021, to April 1st, 2023.

Table 2. Top 10 negative daily returns for the CSI 300 index.
Top 10 negative returns of the CSI 300 index
Source: computation by the author (data: Yahoo! Finance website).

Table 3. Top 10 positive daily returns for the CSI 300 index.
Top 10 positive returns of the CSI 300 index
Source: computation by the author (data: Yahoo! Finance website).

Modelling of the tails

Here the tail modelling is conducted based on the Peak-over-Threshold (POT) approach which corresponds to a Generalized Pareto Distribution (GPD). Let us recall the theoretical background of this approach.

The POT approach takes into account all data entries above a designated high threshold u. The threshold exceedances can be fitted to a generalized Pareto distribution:

 Illustration of the POT approach

An important issue for the POT-GPD approach is the threshold selection. An optimal threshold level can be derived by calibrating the tradeoff between bias and inefficiency. There exist several approaches to address this problem, including a Monte Carlo simulation method inspired by the work of Jansen and de Vries (1991). In this article, to fit the GPD, we use the 2.5% quantile for the modelling of the negative tail and the 97.5% quantile for that of the positive tail.
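As an illustration, the fit of a GPD to threshold exceedances can be sketched in Python with a simple method-of-moments estimator (in practice, and possibly in the R program accompanying this post, maximum likelihood is typically used instead):

```python
import numpy as np

def gpd_fit_mom(losses, threshold_quantile=0.975):
    """Fit a Generalized Pareto Distribution to exceedances over a high
    threshold by the method of moments. `losses` should be positive
    losses (e.g. the negated returns when modelling the left tail)."""
    x = np.asarray(losses, dtype=float)
    u = np.quantile(x, threshold_quantile)  # threshold, here the 97.5% quantile
    excess = x[x > u] - u                   # exceedances over the threshold
    m, v = excess.mean(), excess.var(ddof=1)
    xi = 0.5 * (1.0 - m * m / v)            # shape parameter estimate
    sigma = 0.5 * m * (m * m / v + 1.0)     # scale parameter estimate
    return u, xi, sigma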

Based on the POT-GPD approach with a fixed threshold selection, we arrive at the following modelling results for the GPD for negative extreme returns (Table 4) and positive extreme returns (Table 5) for the CSI 300 index:

Table 4. Estimate of the parameters of the GPD for negative daily returns for the CSI 300 index.
Modelling of negative extreme returns of the CSI 300 index
Source: computation by the author (data: Yahoo! Finance website).

Table 5. Estimate of the parameters of the GPD for positive daily returns for the CSI 300 index.
Modelling of positive extreme returns of the CSI 300 index
Source: computation by the author (data: Yahoo! Finance website).

Figure 3 represents the historical distribution of negative return exceedances and the estimated GPD for the left tail.

Figure 3. GPD for the left tail of the CSI 300 index returns.
GPD for the left tail of the CSI 300 index returns
Source: computation by the author (data: Yahoo! Finance website).

Figure 4 represents the historical distribution of positive return exceedances and the estimated GPD for the right tail.

Figure 4. GPD for the right tail of the CSI 300 index returns.
GPD for the right tail of the CSI 300 index returns
Source: computation by the author (data: Yahoo! Finance website).

Applications in risk management

Extreme Value Theory (EVT) as a statistical approach is used to analyze the tails of a distribution, focusing on extreme events or rare occurrences. EVT can be applied to various risk management techniques, including Value at Risk (VaR), Expected Shortfall (ES), and stress testing, to provide a more comprehensive understanding of extreme risks in financial markets.

Why should I be interested in this post?

Extreme Value Theory is a useful tool to model the tails of the evolution of a financial instrument. In the ever-evolving landscape of financial markets, being able to grasp the concept of EVT presents a unique edge to students who aspire to become an investment or risk manager. It not only provides a deeper insight into the dynamics of equity markets but also equips them with a practical skill set essential for risk analysis. By exploring how EVT refines risk measures like Value at Risk (VaR) and Expected Shortfall (ES) and its role in stress testing, students gain a valuable perspective on how financial institutions navigate during extreme events. In a world where financial crises and market volatility are recurrent, this post opens the door to a powerful analytical framework that contributes to informed decisions and financial stability.

Download R file to model extreme behavior of the index

You can find below an R file (file in txt format) to study extreme returns and model the distribution tails for the CSI 300 index.

Download R file to study extreme returns and model the distribution tails for the CSI 300 index

Related posts on the SimTrade blog

About financial indexes

▶ Nithisha CHALLA Financial indexes

▶ Nithisha CHALLA Calculation of financial indexes

▶ Nithisha CHALLA The CSI 300 index

About portfolio management

▶ Youssef LOURAOUI Portfolio

▶ Jayati WALIA Returns

About statistics

▶ Shengyu ZHENG Moments de la distribution

▶ Shengyu ZHENG Mesures de risques

▶ Shengyu ZHENG Extreme Value Theory: the Block-Maxima approach and the Peak-Over-Threshold approach

▶ Gabriel FILJA Application de la théorie des valeurs extrêmes en finance de marchés

Useful resources

Academic resources

Embrechts P., C. Klüppelberg and T. Mikosch (1997) Modelling Extremal Events for Insurance and Finance Springer-Verlag.

Embrechts P., R. Frey, McNeil A.J. (2022) Quantitative Risk Management Princeton University Press.

Gumbel, E. J. (1958) Statistics of extremes New York: Columbia University Press.

Longin F. (2016) Extreme events in finance: a handbook of extreme value theory and its applications Wiley Editions.

Other resources

Extreme Events in Finance

Chan S. Statistical tools for extreme value analysis

Rieder H. E. (2014) Extreme Value Theory: A primer (slides).

About the author

The article was written in November 2023 by Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024).

Extreme returns and tail modelling of the Nikkei 225 index for the Japanese equity market


Shengyu ZHENG

In this article, Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024) describes the statistical behavior of extreme returns of the Nikkei 225 index for the Japanese equity market and explains how extreme value theory can be used to model the tails of its distribution.

The Nikkei 225 index for the Japanese equity market

The Nikkei 225, often simply referred to as the Nikkei, is a stock market index representing the performance of 225 major companies listed on the Tokyo Stock Exchange (TSE). Originating in 1950, this index has become a symbol of Japan’s economic prowess and serves as a crucial benchmark in the Asian financial markets. Comprising companies across diverse sectors such as technology, automotive, finance, and manufacturing, the Nikkei 225 offers a comprehensive snapshot of the Japanese economic landscape, reflecting the nation’s technological innovation, industrial strength, and global economic influence.

Utilizing a price-weighted methodology, the Nikkei 225 calculates its value based on stock prices rather than market capitalization, distinguishing it from many other indices. This approach means that higher-priced stocks have a more significant impact on the index’s movements. Investors and financial analysts worldwide closely monitor the Nikkei 225 for insights into Japan’s economic trends, market sentiment, and investment opportunities. As a vital indicator of the direction of the Japanese stock market, the Nikkei 225 continues to be a key reference point for making informed investment decisions and navigating the complexities of the global financial landscape.
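The price-weighted calculation can be sketched as follows; this is a deliberate simplification that ignores the price adjustment factors the real Nikkei 225 applies to each constituent:

```python
def price_weighted_index(prices, divisor):
    """A simplified price-weighted index: the sum of constituent prices
    divided by an index divisor (kept constant here; in practice it is
    adjusted for corporate actions such as splits)."""
    return sum(prices) / divisor

# Hypothetical example: the higher-priced stock dominates the index level
level = price_weighted_index([100.0, 200.0, 3000.0], divisor=3.0)  # 1100.0
```

A 1% move in the 3000-yen stock shifts this index far more than a 1% move in the 100-yen stock, which is exactly the property that distinguishes price weighting from market-cap weighting.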

In this article, we focus on the Nikkei 225 index over the timeframe from April 1st, 2015, to April 1st, 2023. Below is a line chart depicting the evolution of the index level over this period.

Figure 1 below gives the evolution of the Nikkei 225 index from April 1, 2015 to April 1, 2023 on a daily basis.

Figure 1. Evolution of the Nikkei 225 index.
Evolution of the Nikkei 225 index
Source: computation by the author (data: Yahoo! Finance website).

Figure 2 below gives the evolution of the daily logarithmic returns of the Nikkei 225 index from April 1, 2015 to April 1, 2023. We observe volatility clustering, reflecting large price fluctuations in both directions (up and down movements). This alternation of periods of low and high volatility is well modeled by ARCH models.

Figure 2. Evolution of the Nikkei 225 index logarithmic returns.
Evolution of the Nikkei 225 index return
Source: computation by the author (data: Yahoo! Finance website).

Summary statistics for the Nikkei 225 index

Table 1 below presents the summary statistics estimated for the Nikkei 225 index:

Table 1. Summary statistics for the Nikkei 225 index.
summary statistics of the Nikkei 225 index returns
Source: computation by the author (data: Yahoo! Finance website).

The mean, the standard deviation / variance, the skewness, and the kurtosis refer to the first, second, third and fourth moments of the statistical distribution of returns, respectively. We can conclude that during this timeframe, the Nikkei 225 index follows a slight upward trend, with relatively large daily deviations, negative skewness and excess kurtosis.

Tables 2 and 3 below present the top 10 negative daily returns and top 10 positive daily returns for the index over the period from April 1, 2015 to April 1, 2023.

Table 2. Top 10 negative daily returns for the Nikkei 225 index.
Top 10 negative returns of the Nikkei 225 index
Source: computation by the author (data: Yahoo! Finance website).

Table 3. Top 10 positive daily returns for the Nikkei 225 index.
Top 10 positive returns of the Nikkei 225 index
Source: computation by the author (data: Yahoo! Finance website).

Modelling of the tails

Here the tail modelling is conducted based on the Peak-over-Threshold (POT) approach which corresponds to a Generalized Pareto Distribution (GPD). Let’s recall the theoretical background of this approach.

The POT approach takes into account all data entries above a designated high threshold u. The threshold exceedances can be fitted to a generalized Pareto distribution:

 Illustration of the POT approach

An important issue for the POT-GPD approach is the threshold selection. An optimal threshold level can be derived by calibrating the tradeoff between bias and inefficiency. There exist several approaches to address this problem, including a Monte Carlo simulation method inspired by the work of Jansen and de Vries (1991). In this article, to fit the GPD, we use the 2.5% quantile for the modelling of the negative tail and the 97.5% quantile for that of the positive tail.
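Once GPD parameters are fitted, tail quantiles such as the Stress Value follow from the standard POT quantile estimator. A Python sketch with hypothetical inputs:

```python
import math

def pot_quantile(u, xi, sigma, n, n_u, p):
    """Tail quantile at probability level p from fitted GPD parameters,
    using the standard POT quantile estimator:
    q_p = u + (sigma / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1),
    where u is the threshold, n the sample size and n_u the number of
    exceedances over u."""
    tail = (n / n_u) * (1.0 - p)
    if xi == 0.0:  # exponential-tail limit of the GPD
        return u - sigma * math.log(tail)
    return u + (sigma / xi) * (tail ** (-xi) - 1.0)

# Hypothetical inputs: threshold at the 97.5% quantile of 1000 returns
# (25 exceedances), GPD shape 0.2 and scale 1.0, 99% quantile requested
sv = pot_quantile(u=2.0, xi=0.2, sigma=1.0, n=1000, n_u=25, p=0.99)
```

A heavier tail (larger shape parameter) or a higher probability level pushes the estimated quantile further out, which is the behavior one expects from a Stress Value.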

Based on the POT-GPD approach with a fixed threshold selection, we arrive at the following modelling results for the GPD for negative extreme returns (Table 4) and positive extreme returns (Table 5) for the Nikkei 225 index:

Table 4. Estimate of the parameters of the GPD for negative daily returns for the Nikkei 225 index.
Modelling of negative extreme returns of the Nikkei 225 index
Source: computation by the author (data: Yahoo! Finance website).

Table 5. Estimate of the parameters of the GPD for positive daily returns for the Nikkei 225 index.
Modelling of positive extreme returns of the Nikkei 225 index
Source: computation by the author (data: Yahoo! Finance website).

Figure 3. GPD for the left tail of the Nikkei 225 index returns.
GPD for the left tail of the Nikkei 225 index returns
Source: computation by the author (data: Yahoo! Finance website).

Figure 4. GPD for the right tail of the Nikkei 225 index returns.
GPD for the right tail of the Nikkei 225 index returns
Source: computation by the author (data: Yahoo! Finance website).

Applications in risk management

Extreme Value Theory (EVT) as a statistical approach is used to analyze the tails of a distribution, focusing on extreme events or rare occurrences. EVT can be applied to various risk management techniques, including Value at Risk (VaR), Expected Shortfall (ES), and stress testing, to provide a more comprehensive understanding of extreme risks in financial markets.

Why should I be interested in this post?

Extreme Value Theory is a useful tool to model the tails of the evolution of a financial instrument. In the ever-evolving landscape of financial markets, being able to grasp the concept of EVT presents a unique edge to students who aspire to become an investment or risk manager. It not only provides a deeper insight into the dynamics of equity markets but also equips them with a practical skill set essential for risk analysis. By exploring how EVT refines risk measures like Value at Risk (VaR) and Expected Shortfall (ES) and its role in stress testing, students gain a valuable perspective on how financial institutions navigate during extreme events. In a world where financial crises and market volatility are recurrent, this post opens the door to a powerful analytical framework that contributes to informed decisions and financial stability.

Download R file to model extreme behavior of the index

You can find below an R file (file in txt format) to study extreme returns and model the distribution tails for the Nikkei 225 index.

Download R file to study extreme returns and model the distribution tails for the Nikkei 225 index

Related posts on the SimTrade blog

About financial indexes

   ▶ Nithisha CHALLA Financial indexes

   ▶ Nithisha CHALLA Calculation of financial indexes

   ▶ Nithisha CHALLA The Nikkei 225 index

About portfolio management

   ▶ Youssef LOURAOUI Portfolio

   ▶ Jayati WALIA Returns

About statistics

   ▶ Shengyu ZHENG Moments de la distribution

   ▶ Shengyu ZHENG Mesures de risques

   ▶ Shengyu ZHENG Extreme Value Theory: the Block-Maxima approach and the Peak-Over-Threshold approach

   ▶ Gabriel FILJA Application de la théorie des valeurs extrêmes en finance de marchés

Useful resources

Academic resources

Embrechts P., C. Klüppelberg and T. Mikosch (1997) Modelling Extremal Events for Insurance and Finance Springer-Verlag.

Embrechts P., R. Frey, McNeil A.J. (2022) Quantitative Risk Management Princeton University Press.

Gumbel, E. J. (1958) Statistics of extremes New York: Columbia University Press.

Longin F. (2016) Extreme events in finance: a handbook of extreme value theory and its applications Wiley Editions.

Other resources

Extreme Events in Finance

Chan S. Statistical tools for extreme value analysis

Rieder H. E. (2014) Extreme Value Theory: A primer (slides).

About the author

The article was written in November 2023 by Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024).

Extreme returns and tail modelling of the FTSE 100 index for the UK equity market


Shengyu ZHENG

In this article, Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024) describes the statistical behavior of extreme returns of the FTSE 100 index for the UK equity market and explains how extreme value theory can be used to model the tails of its distribution.

The FTSE 100 index for the UK equity market

The FTSE 100 index, an acronym for the Financial Times Stock Exchange 100 Index, stands as a cornerstone of the UK financial landscape. Comprising the largest and most robust companies listed on the London Stock Exchange (LSE), this index is a barometer for the overall health and trajectory of the British stock market. Spanning diverse sectors such as finance, energy, healthcare, and consumer goods, the FTSE 100 encapsulates the economic pulse of the nation. The 100 companies in the index are chosen based on their market capitalization, with larger entities carrying more weight in the index’s calculation, making it a valuable tool for investors seeking a comprehensive snapshot of the UK’s economic performance.

Investors and analysts globally turn to the FTSE 100 for insights into market trends and economic stability in the UK. The index’s movements provide a useful reference point for decision-making, enabling investors to gauge the relative strength and weaknesses of different industries and the economy at large. Moreover, the FTSE 100 serves as a powerful benchmark for numerous financial instruments, including mutual funds, exchange-traded funds (ETFs), and other investment products. As a result, the index plays a pivotal role in shaping investment strategies and fostering a deeper understanding of the intricate dynamics that drive the British financial markets.

In this article, we focus on the FTSE 100 index over the timeframe from April 1st, 2015, to April 1st, 2023. Below is a line chart depicting the evolution of the index level over this period.

Figure 1 below gives the evolution of the FTSE 100 index from April 1, 2015 to April 1, 2023 on a daily basis.

Figure 1. Evolution of the FTSE 100 index.
Evolution of the FTSE 100 index
Source: computation by the author (data: Yahoo! Finance website).

Figure 2 below gives the evolution of the daily logarithmic returns of the FTSE 100 index from April 1, 2015 to April 1, 2023. We observe volatility clustering, reflecting large price fluctuations in both directions (up and down movements). This alternation of periods of low and high volatility is well modeled by ARCH models.

Figure 2. Evolution of the FTSE 100 index returns.
Evolution of the FTSE 100 index returns
Source: computation by the author (data: Yahoo! Finance website).

Summary statistics for the FTSE 100 index

Table 1 below presents the summary statistics estimated for the FTSE 100 index:

Table 1. Summary statistics for the FTSE 100 index returns.
Summary statistics of the FTSE 100 index returns
Source: computation by the author (data: Yahoo! Finance website).

The mean, the standard deviation / variance, the skewness, and the kurtosis refer to the first, second, third and fourth moments of the statistical distribution of returns, respectively. We can conclude that during this timeframe, the FTSE 100 index follows a slight upward trend, with relatively large daily deviations, negative skewness and excess kurtosis.

Tables 2 and 3 below present the top 10 negative daily returns and top 10 positive daily returns for the index over the period from April 1, 2015 to April 1, 2023.

Table 2. Top 10 negative daily returns for the FTSE 100 index.
Top 10 negative returns of the FTSE 100 index
Source: computation by the author (data: Yahoo! Finance website).

Table 3. Top 10 positive daily returns for the FTSE 100 index.
Top 10 positive returns of the FTSE 100 index
Source: computation by the author (data: Yahoo! Finance website).

Modelling of the tails

Here the tail modelling is conducted based on the Peak-over-Threshold (POT) approach which corresponds to a Generalized Pareto Distribution (GPD). Let’s recall the theoretical background of this approach.

The POT approach takes into account all data entries above a designated high threshold u. The threshold exceedances can be fitted to a generalized Pareto distribution:

 Illustration of the POT approach

An important issue for the POT-GPD approach is the threshold selection. An optimal threshold level can be derived by calibrating the tradeoff between bias and inefficiency. There exist several approaches to address this problem, including a Monte Carlo simulation method inspired by the work of Jansen and de Vries (1991). In this article, to fit the GPD, we use the 2.5% quantile for the modelling of the negative tail and the 97.5% quantile for that of the positive tail.
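A common visual diagnostic for the threshold selection discussed above is the empirical mean excess function, which becomes approximately linear above a suitable threshold when the tail is GPD. A Python sketch with simulated data:

```python
import numpy as np

def mean_excess(losses, thresholds):
    """Empirical mean excess function e(u) = average of (X - u) over X > u.
    If the data have a GPD tail, e(u) is approximately linear in u above a
    suitable threshold, which helps to pick the threshold visually."""
    x = np.asarray(losses, dtype=float)
    return [float((x[x > u] - u).mean()) for u in thresholds]
```

Plotting e(u) against a grid of thresholds and looking for the point beyond which the curve is roughly linear is the classic "mean residual life plot" diagnostic.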

Based on the POT-GPD approach with a fixed threshold selection, we arrive at the following modelling results for the GPD for negative extreme returns (Table 4) and positive extreme returns (Table 5) for the FTSE 100 index:

Table 4. Estimate of the parameters of the GPD for negative daily returns for the FTSE 100 index.
Estimate of the parameters of the GPD for negative daily returns for the FTSE 100 index
Source: computation by the author (data: Yahoo! Finance website).

Table 5. Estimate of the parameters of the GPD for positive daily returns for the FTSE 100 index.
Estimate of the parameters of the GPD for positive daily returns for the FTSE 100 index
Source: computation by the author (data: Yahoo! Finance website).

Figure 3. GPD for the left tail of the FTSE 100 index returns.
GPD for the left tail of the FTSE 100 index returns
Source: computation by the author (data: Yahoo! Finance website).

Figure 4. GPD for the right tail of the FTSE 100 index returns.
GPD for the right tail of the FTSE 100 index returns
Source: computation by the author (data: Yahoo! Finance website).

Applications in risk management

Extreme Value Theory (EVT) as a statistical approach is used to analyze the tails of a distribution, focusing on extreme events or rare occurrences. EVT can be applied to various risk management techniques, including Value at Risk (VaR), Expected Shortfall (ES), and stress testing, to provide a more comprehensive understanding of extreme risks in financial markets.
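As a toy illustration of the historical versions of these risk measures (the simulated Gaussian returns below stand in for real data; EVT-based refinements would model the tail beyond these empirical quantiles):

```python
import numpy as np

# Simulated daily returns as a stand-in for market data
rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=5000)

alpha = 0.05
var_left = np.quantile(returns, alpha)         # 5% quantile: left-tail historical VaR
es_left = returns[returns <= var_left].mean()  # average return below the VaR level
print(f"VaR(5%) = {var_left:.4f}, ES(5%) = {es_left:.4f}")
```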

Why should I be interested in this post?

Extreme Value Theory is a useful tool to model the tails of the evolution of a financial instrument. In the ever-evolving landscape of financial markets, being able to grasp the concept of EVT presents a unique edge to students who aspire to become an investment or risk manager. It not only provides a deeper insight into the dynamics of equity markets but also equips them with a practical skill set essential for risk analysis. By exploring how EVT refines risk measures like Value at Risk (VaR) and Expected Shortfall (ES) and its role in stress testing, students gain a valuable perspective on how financial institutions navigate during extreme events. In a world where financial crises and market volatility are recurrent, this post opens the door to a powerful analytical framework that contributes to informed decisions and financial stability.

Download R file to model extreme behavior of the index

You can find below an R file (file with txt format) to study extreme returns and model the distribution tails for the FTSE 100 index.

Download R file to study extreme returns and model the distribution tails for the FTSE 100 index

Related posts on the SimTrade blog

About financial indexes

   ▶ Nithisha CHALLA Financial indexes

   ▶ Nithisha CHALLA Calculation of financial indexes

   ▶ Nithisha CHALLA The FTSE 100 index

About portfolio management

   ▶ Youssef LOURAOUI Portfolio

   ▶ Jayati WALIA Returns

About statistics

   ▶ Shengyu ZHENG Moments de la distribution

   ▶ Shengyu ZHENG Mesures de risques

   ▶ Shengyu ZHENG Extreme Value Theory: the Block-Maxima approach and the Peak-Over-Threshold approach

   ▶ Gabriel FILJA Application de la théorie des valeurs extrêmes en finance de marchés

Useful resources

Academic resources

Embrechts P., C. Klüppelberg and T. Mikosch (1997) Modelling Extremal Events for Insurance and Finance, Springer-Verlag.

Embrechts P., R. Frey and A.J. McNeil (2022) Quantitative Risk Management, Princeton University Press.

Gumbel E.J. (1958) Statistics of extremes, Columbia University Press.

Jansen D.W. and C.G. de Vries (1991) On the frequency of large stock returns: putting booms and busts into perspective, Review of Economics and Statistics, 73(1), 18-24.

Longin F. (2016) Extreme events in finance: a handbook of extreme value theory and its applications, Wiley Editions.

Other resources

Extreme Events in Finance

Chan S. Statistical tools for extreme value analysis

Rieder H. E. (2014) Extreme Value Theory: A primer (slides).

About the author

The article was written in November 2023 by Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024).

Copula

Copula

Shengyu ZHENG

In this article, Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024) presents copula, a statistical tool that is commonly used to model dependency of random variables.

Linear correlation

In a world stacked with various risks, a simplistic look at individual risks does not suffice, since the interactions between risks can add to or diminish the aggregate risk loading. As often seen in statistical modelling, linear correlation, one of the simplest ways to measure dependency between random variables, is commonly used for this purpose.

Definition of linear correlation

To put it concisely, the linear correlation coefficient, denoted by ρ(X,Y), takes values within the range of -1 to 1 and represents the linear correlation of two random variables X and Y. A positive ρ(X,Y) indicates a positive linear relationship, signifying that as one variable increases, the other tends to increase as well. Conversely, a negative ρ(X,Y) denotes a negative linear relationship, signifying that as one variable increases, the other tends to decrease. A correlation coefficient near zero implies a lack of linear relationship.

Limitation of linear correlation

As a simplistic model, while having the advantage of easy application, linear correlation fails to capture the intricacy of the dependence structure between random variables. There exist three main limitations of linear correlation:

  • ρ(X,Y) only gives a scalar summary of linear dependence, and it requires that both var(X) and var(Y) exist and be finite;
  • If X and Y are stochastically independent, then ρ(X,Y) = 0. However, the converse does not hold in most cases (except if (X,Y) is a Gaussian random vector);
  • Linear correlation is not invariant under strictly increasing transformations: if T is such a transformation, ρ(T(X),T(Y)) ≠ ρ(X,Y) in general.

Therefore, knowing the marginal distributions of two random variables and their linear correlation does not suffice to determine their joint distribution.
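The third limitation can be checked numerically. The sketch below (simulated data, using scipy.stats) applies the strictly increasing transformation exp and compares the Pearson (linear) correlation with the rank-based Spearman correlation, which depends only on the copula and is therefore invariant:

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(1)
x = rng.normal(size=1000)
y = 0.8 * x + 0.6 * rng.normal(size=1000)

t = np.exp  # a strictly increasing transformation
r_before, r_after = pearsonr(x, y)[0], pearsonr(t(x), t(y))[0]
rho_before, rho_after = spearmanr(x, y)[0], spearmanr(t(x), t(y))[0]

print(f"Pearson:  {r_before:.3f} -> {r_after:.3f}")      # changes under t
print(f"Spearman: {rho_before:.3f} -> {rho_after:.3f}")  # unchanged under t
```

The Spearman coefficient is unchanged because exp preserves the ranks of the observations.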

Copula

A copula is a mathematical function that describes the dependence structure between multiple random variables, irrespective of their marginal distributions. It describes the interdependency that transcends linear relationships. Copulas are employed to model the joint distribution of variables by separating the marginal distributions from the dependence structure, allowing for a more flexible and comprehensive analysis of multivariate data. Essentially, copulas serve as a bridge between the individual distributions of variables and their joint distribution, enabling the characterization of their interdependence.

Definition of copula

A copula, denoted typically as C∶[0,1]d→[0,1] , is a multivariate distribution function whose marginals are uniformly distributed on the unit interval. The parameter d is the number of variables. For a set of random variables U1, …, Ud with cumulative distribution functions F1, …, Fd, the copula function C satisfies:

C(F1(u1),…,Fd(ud)) = ℙ(U1≤u1,…,Ud≤ud)

Fréchet-Hoeffding bounds

The Fréchet–Hoeffding theorem states that copulas follow the bounds:

max{u1 + … + ud – d + 1, 0} ≤ C(u1, …, ud) ≤ min{u1, …, ud}

In the bivariate case (d = 2), the Fréchet–Hoeffding bounds are

max{u+v-1,0} ≤ C(u,v) ≤ min{u,v}

The upper bound corresponds to the case of comonotonicity (perfect positive dependence) and the lower bound corresponds to the case of countermonotonicity (perfect negative dependence).
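These bounds can be verified numerically on a grid, for instance for the independence copula C(u,v) = uv (a minimal sketch; any valid copula would satisfy the same inequalities):

```python
import numpy as np

grid = np.linspace(0.01, 0.99, 50)
U, V = np.meshgrid(grid, grid)
C = U * V  # the independence copula as an example

lower = np.maximum(U + V - 1.0, 0.0)  # countermonotonicity bound
upper = np.minimum(U, V)              # comonotonicity bound
print(bool(np.all(lower <= C) and np.all(C <= upper)))  # prints True
```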

Sklar’s theorem

Sklar’s theorem states that every multivariate cumulative distribution function of a random vector X can be expressed in terms of its marginals and a copula. The copula is unique if the marginal distributions are continuous. The theorem also states that the converse is true.

Sklar’s theorem shows how a unique copula C fully describes the dependence of X. The theorem provides a way to decompose a multivariate joint distribution function into its marginal distributions and a copula function.

Examples of copulas

Many types of dependence structures exist, and new copulas are being introduced by researchers. There are three standard classes of copulas that are commonly in use among practitioners: elliptical or normal copulas, Archimedean copulas, and extreme value copulas.

Elliptical or normal copulas

The Gaussian copula and the Student t-copula belong to this category. Note that the Gaussian copula played a notable role in the 2008 financial crisis, particularly in the context of mortgage-backed securities and collateralized debt obligations (CDOs). The assumption of normality and the underestimation of systemic risk based on the Gaussian copula failed to account for the extreme risks that materialized during the crisis.

Here is an example of a simulated normal copula with the parameter being 0.8.

Figure 1. Simulation of normal copula.
Simulation of normal copula
Source: computation by the author.
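A simulation in the spirit of Figure 1 can be sketched in Python (the correlation parameter is 0.8 as in the figure; the seed and sample size are arbitrary choices): draw from a bivariate normal distribution, then map each margin through the standard normal CDF to obtain uniform marginals.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
rho = 0.8
cov = [[1.0, rho], [rho, 1.0]]

# Bivariate normal draws, then probability-integral transform of each margin
z = rng.multivariate_normal([0.0, 0.0], cov, size=2000)
u = norm.cdf(z)  # each column is uniform on [0,1]; the dependence is the normal copula
```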

Archimedean copulas

Archimedean copulas are a class of copulas with a particular mathematical structure. They are built from functions known as Archimedean generators: a generator φ defines the copula through C(u1, …, ud) = φ⁻¹(φ(u1) + … + φ(ud)).

Here is an example of a simulated Clayton copula with the parameter equal to 3, which belongs to the category of Archimedean copulas.

Figure 2. Simulation of Clayton copula.
Simulation of Clayton copula
Source: computation by the author.
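A Clayton sample like the one in Figure 2 can be generated with the Marshall–Olkin (frailty) algorithm, sketched below for parameter θ = 3: the mixing variable is Gamma(1/θ)-distributed, and its Laplace transform is the inverse of the Clayton generator (the seed and sample size are arbitrary choices).

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n = 3.0, 2000

# Marshall-Olkin algorithm: mix iid exponentials with a Gamma(1/theta) frailty
v = rng.gamma(shape=1.0 / theta, scale=1.0, size=n)
e = rng.exponential(scale=1.0, size=(n, 2))
u = (1.0 + e / v[:, None]) ** (-1.0 / theta)  # pairs on [0,1]^2 with Clayton dependence
```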

Extreme value copulas

Extreme value copulas could overlap with the two other classes. They are a specialized class of copulas designed to model the tail dependence structure of multivariate extreme events. These copulas are particularly useful in situations where the focus is on capturing dependencies in the extreme upper or lower tails of the distribution.

Here is an example of a simulated Tawn copula with the parameter equal to 0.8, which belongs to the category of extreme value copulas.

Figure 3. Simulation of Tawn copula.
Simulation of Tawn copula
Source: computation by the author.

Download R file to simulate copulas

You can find below an R file (file with txt format) to simulate the 3 copulas mentioned above.

Download R file to simulate copulas

Why should I be interested in this post?

Copulas are pivotal in risk management, offering a sophisticated approach to model the dependence among various risk factors. They play a crucial role in portfolio risk assessment, providing insights into how different assets behave together and enhancing the robustness of risk measures, especially in capturing tail dependencies. Copulas are also valuable in credit risk management, aiding in the assessment of joint default probabilities and contributing to an understanding of credit risks associated with diverse financial instruments. Their applications extend to insurance, operational risk management, and stress testing scenarios, providing a toolset for comprehensive risk evaluation and informed decision-making in dynamic financial environments.

Related posts on the SimTrade blog

▶ Shengyu ZHENG Moments de la distribution

▶ Shengyu ZHENG Mesures de risques

▶ Shengyu ZHENG Extreme Value Theory: the Block-Maxima approach and the Peak-Over-Threshold approach

▶ Gabriel FILJA Application de la théorie des valeurs extrêmes en finance de marchés

Useful resources

Course notes from Quantitative Risk Management of Prof. Marie Kratz, ESSEC Business School.

About the author

The article was written in November 2023 by Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024).

Bitcoin: the mother of all cryptocurrencies

Bitcoin: the mother of all cryptocurrencies

 Snehasish CHINARA

In this article, Snehasish CHINARA (ESSEC Business School, Grande Ecole Program – Master in Management, 2022-2024) explains Bitcoin which is considered as the mother of all cryptocurrencies.

Historical context and background

The genesis of Bitcoin can be traced back to the aftermath of the Financial Crisis of 2008, when a growing desire emerged for a currency immune to central authority control. Traditional banks had faltered, leading to the devaluation of money through government-sanctioned printing. The absence of a definitive limit on money creation fostered uncertainty. Bitcoin ingeniously addressed this quandary by establishing a fixed supply of coins and a controlled production rate through transparent coding. This code’s openness ensured that no entity, including governments, could manipulate the currency’s value. Consequently, Bitcoin’s worth became solely determined by market dynamics, evading the arbitrary alterations typical of government-managed currencies.

Furthermore, Bitcoin revolutionized financial transactions by eliminating reliance on third-party intermediaries, exemplified by banks. Users can now engage in direct peer-to-peer transactions, circumventing the potential for intermediaries to engage in risky financial ventures akin to the 2008 Financial Crisis. The process of safeguarding one’s Bitcoins is equally innovative, as users manage their funds through a Bitcoin Wallet. Unlike traditional banks, these wallets operate as personal assets, with users as their own bankers. While various companies offer wallet services, the underlying code remains accessible for review, ensuring customers’ trust and the safety of their deposits.

Bitcoin Logo
Bitcoin Logo
Source: internet.

Figure 1. Key Dates in Bitcoin History
Key Dates in Bitcoin History
Source: author of this post.

Key features and use cases

Examples of areas where Bitcoin is currently being used:

  • Digital Currency: Bitcoin serves as a digital currency for everyday transactions, allowing users to buy goods and services online and in physical stores.
  • Crypto Banking: Bitcoin is used in decentralized finance (DeFi) applications, where users can lend, borrow, and earn interest on their Bitcoin holdings.
  • Asset Tokenization: Bitcoin is used to tokenize real-world assets like real estate and art, making them more accessible and divisible among investors.
  • Onchain Governance: Some blockchain projects utilize Bitcoin for on-chain governance, enabling token holders to vote on protocol upgrades and changes.
  • Smart Contracts: While Ethereum is more widely associated with smart contracts, Bitcoin’s second layer solutions like RSK (Rootstock) allow for the execution of smart contracts on the Bitcoin blockchain.
  • Corporate Treasuries: Large corporations, such as Tesla, have invested in Bitcoin as a store of value and an asset to diversify their corporate treasuries.
  • State Treasuries: Some countries, like El Salvador, have adopted Bitcoin as legal tender and added it to their national treasuries to facilitate cross-border remittances and financial inclusion.
  • Store of Value During Times of Conflict: In regions with economic instability or conflict, Bitcoin is used as a hedge against currency devaluation and asset confiscation.
  • Online Gambling: Bitcoin is widely accepted in online gambling platforms, providing users with a secure and pseudonymous way to wager on games and sports.
  • Salary Payments for Freelancers in Emerging Markets: Freelancers in countries with limited access to traditional banking use Bitcoin to receive payments from international clients, circumventing costly and slow remittance services.
  • Cross-Border Transactions with Bitcoin Gold: Cross-border transactions can often be complex, time-consuming, and costly due to the involvement of multiple intermediaries and the varying regulations of different countries. However, Bitcoin Gold offers a streamlined solution for facilitating global payments, making cross-border transactions more efficient and accessible.

These examples highlight the diverse utility of Bitcoin, ranging from everyday transactions to more complex financial applications and as a tool for economic empowerment in various contexts.

Technology and underlying blockchain

Blockchain technology is the foundational innovation that underpins Bitcoin, the world’s first and most well-known cryptocurrency. At its core, blockchain is a decentralized and distributed ledger system that records transactions across a network of computers in a secure and transparent manner. In the context of Bitcoin, this blockchain serves as a public ledger that tracks every transaction ever made with the cryptocurrency. What sets blockchain apart is its ability to ensure trust and security without the need for a central authority, such as a bank or government. Each block in the chain contains a set of transactions, and these blocks are linked together in a chronological and immutable fashion. This means that once a transaction is recorded on the blockchain, it cannot be altered or deleted. This transparency, immutability, and decentralization make blockchain technology a revolutionary tool not only for digital currencies like Bitcoin but also for a wide range of applications in various industries, from finance and supply chain management to healthcare and beyond.

Moreover, Bitcoin operates on a decentralized network of computers (nodes) worldwide. These nodes validate and confirm transactions, ensuring that the network remains secure, censorship-resistant, and immune to central control. The absence of a central authority is a fundamental characteristic of Bitcoin and a key differentiator from traditional financial systems. Bitcoin relies on a Proof-of-Work (PoW) consensus mechanism for securing its network. Miners compete to solve complex mathematical puzzles, and the first one to solve it gets the right to add a new block of transactions to the blockchain. This process ensures the security of the network, prevents double-spending, and maintains the integrity of the ledger. Bitcoin has a fixed supply of 21 million coins, a feature hard-coded into its protocol. The rate at which new Bitcoins are created is reduced by half approximately every four years through a process known as a “halving.” This limited supply is in stark contrast to fiat currencies, which can be printed without restriction.

These technological aspects collectively make Bitcoin a groundbreaking innovation that has disrupted traditional finance and is increasingly studied and integrated into the field of finance. It offers unique opportunities and challenges for finance students to explore, including its impact on monetary policy, investment, and the broader financial ecosystem.

Supply of coins

Looking at the supply side of bitcoins, the number of bitcoins in circulation is given by the following mathematical formula:

Formula for the number of bitcoins in circulation

This calculation hinges upon the fundamental concept of the Bitcoin supply schedule, which employs a diminishing issuance rate through a process known as “halving”.
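The issuance schedule can be sketched in Python. This is a simplified version: the block subsidy starts at 50 BTC and halves every 210,000 blocks; the real protocol rounds the subsidy down to a whole number of satoshis, so the actual cap is slightly below 21 million coins.

```python
def coins_in_circulation(block_height: int) -> float:
    """Approximate number of bitcoins issued up to a given block height."""
    subsidy = 50.0  # initial block reward in BTC
    total, remaining = 0.0, block_height
    while remaining > 0:
        blocks = min(remaining, 210_000)  # blocks left in the current halving epoch
        total += blocks * subsidy
        remaining -= blocks
        subsidy /= 2.0  # "halving" of the block reward
    return total

print(coins_in_circulation(210_000))  # → 10500000.0 (end of the first epoch)
print(round(coins_in_circulation(64 * 210_000)))  # approaches the 21 million cap
```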

Figure 2 represents the evolution of the number of bitcoins in circulation over time based on the above formula.

Figure 2. Number of bitcoins in circulation
Number of bitcoins in circulation
Source: computation by the author.

You can download below the Excel file for the data and the figure of the number of bitcoins in circulation.

Download the Excel file with Bitcoin data

Historical data for Bitcoin

How to get the data?

Bitcoin is the most popular cryptocurrency on the market, and historical data for Bitcoin such as prices and traded volumes can easily be downloaded from internet sources such as Yahoo! Finance, Blockchain.com and CoinMarketCap. For example, you can download data for Bitcoin from Yahoo! Finance (the Yahoo! code for Bitcoin is BTC-USD).

Figure 4. Bitcoin data
Bitcoin data
Source: Yahoo! Finance.

Historical data for the Bitcoin market prices

The market price of Bitcoin is a dynamic and intricate element that reflects a multitude of factors, both intrinsic and extrinsic. The gradual rise in market value over time indicates a willingness among investors and traders to offer higher prices for the cryptocurrency. This signifies a rising interest and strong belief in the project’s potential for the future. The market price reflects the collective sentiment of investors and traders. Comparing the market price of Bitcoin to other similar cryptocurrencies or benchmark assets can provide insights into its relative strength and performance within the market.

The value of Bitcoin in the market is influenced by a variety of elements, with each factor contributing uniquely to their pricing. One of the most significant influences is market sentiment and investor psychology. These factors can cause prices to shift based on positive news, regulatory changes, or reactive selling due to fear. Furthermore, the real-world implementations and usages of Bitcoin are crucial for its prosperity. Concrete use cases such as Decentralized Finance (DeFi), Non-Fungible Tokens (NFTs), and international transactions play a vital role in creating demand and propelling price appreciation. Meanwhile, adherence to basic economic principles is evident in the supply-demand dynamics, where scarcity due to limited issuance, halving events, and token burns interact with the balance between supply and demand.

Together with the number of coins in circulation, the price of the coin is also needed to compute Bitcoin’s market capitalization.

Figure 5 below represents the evolution of the price of Bitcoin in US dollar over the period October 2014 – August 2023. The price corresponds to the “closing” price (observed at 10:00 PM CET at the end of the month).

Figure 5. Evolution of the Bitcoin price
Evolution of the Bitcoin price
Source: computation by the author (data source: Yahoo! Finance).

Python code

Python script to download Bitcoin historical data and save it to an Excel sheet:

import yfinance as yf
import pandas as pd

# Define the ticker symbol and date range
ticker_symbol = "BTC-USD"
start_date = "2020-01-01"
end_date = "2023-01-01"

# Download historical data using yfinance
data = yf.download(ticker_symbol, start=start_date, end=end_date)

# Create a Pandas DataFrame
df = pd.DataFrame(data)

# Write the DataFrame to an Excel file (requires the openpyxl package)
df.to_excel("bitcoin_historical_data.xlsx", sheet_name="Bitcoin Historical Data")

print("Data has been saved to bitcoin_historical_data.xlsx")

Make sure you have the required libraries installed and adjust the "start_date" and "end_date" variables to the desired date range for the historical data you want to download.

The code above allows you to download the data from Yahoo! Finance.

Download the Excel file with Bitcoin data

R code

The R program below written by Shengyu ZHENG allows you to download the data from Yahoo! Finance website and to compute summary statistics and risk measures about the Bitcoin.

Download R file

Data file

The R program that you can download above allows you to download the data for the Bitcoin from the Yahoo! Finance website. The database starts on September 17, 2014.

Table 3 below represents the top of the data file for the Bitcoin downloaded from the Yahoo! Finance website with the R program.

Table 3. Top of the data file for the Bitcoin.
Top of the file for the Bitcoin data
Source: computation by the author (data: Yahoo! Finance website).

Evolution of the Bitcoin

Figure 6 below gives the evolution of the Bitcoin from September 17, 2014 to December 31, 2022 on a daily basis.

Figure 6. Evolution of the Bitcoin.
Evolution of the Bitcoin
Source: computation by the author (data: Yahoo! Finance website).

Figure 7 below gives the evolution of the Bitcoin returns from September 17, 2014 to December 31, 2022 on a daily basis.

Figure 7. Evolution of the Bitcoin returns.
Evolution of the Bitcoin return
Source: computation by the author (data: Yahoo! Finance website).

Summary statistics for the Bitcoin

The R program that you can download above also allows you to compute summary statistics about the returns of the Bitcoin.

Table 4 below presents the following summary statistics estimated for the Bitcoin:

  • The mean
  • The standard deviation (the square root of the variance)
  • The skewness
  • The kurtosis.

The mean, the standard deviation / variance, the skewness, and the kurtosis refer to the first, second, third and fourth moments of the statistical distribution of returns, respectively.

Table 4. Summary statistics for the Bitcoin.
Summary statistics for the Bitcoin
Source: computation by the author (data: Yahoo! Finance website).

Statistical distribution of the Bitcoin returns

Historical distribution

Figure 8 represents the historical distribution of the Bitcoin daily returns for the period from September 17, 2014 to December 31, 2022.

Figure 8. Historical distribution of the Bitcoin returns.
Historical distribution of the daily Bitcoin returns
Source: computation by the author (data: Yahoo! Finance website).

Gaussian distribution

The Gaussian distribution (also called the normal distribution) is a parametric distribution with two parameters: the mean and the standard deviation of returns. We estimated these two parameters over the period from September 17, 2014 to December 31, 2022. The annualized mean of daily returns is equal to 30.81% and the annualized standard deviation of daily returns is equal to 62.33%.
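The annualization convention behind these figures can be sketched as follows (a toy series of five daily returns; 252 trading days per year is the usual assumption):

```python
import numpy as np

TRADING_DAYS = 252  # assumed number of trading days per year

daily_returns = np.array([0.002, -0.011, 0.0005, 0.013, -0.004])  # toy sample
annualized_mean = daily_returns.mean() * TRADING_DAYS
annualized_std = daily_returns.std(ddof=1) * np.sqrt(TRADING_DAYS)
print(f"{annualized_mean:.2%}, {annualized_std:.2%}")
```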

Figure 9 below represents the Gaussian distribution of the Bitcoin daily returns with parameters estimated over the period from September 17, 2014 to December 31, 2022.

Figure 9. Gaussian distribution of the Bitcoin returns.
Gaussian distribution of the daily Bitcoin returns
Source: computation by the author (data: Yahoo! Finance website).

Risk measures of the Bitcoin returns

The R program that you can download above also allows you to compute risk measures about the returns of the Bitcoin.

Table 5 below presents the following risk measures estimated for the Bitcoin:

  • The long-term volatility (the unconditional standard deviation estimated over the entire period)
  • The short-term volatility (the standard deviation estimated over the last three months)
  • The Value at Risk (VaR) for the left tail (the 5% quantile of the historical distribution)
  • The Value at Risk (VaR) for the right tail (the 95% quantile of the historical distribution)
  • The Expected Shortfall (ES) for the left tail (the average of returns below the 5% quantile of the historical distribution)
  • The Expected Shortfall (ES) for the right tail (the average of returns above the 95% quantile of the historical distribution)
  • The Stress Value (SV) for the left tail (the 1% quantile of the tail distribution estimated with a Generalized Pareto distribution)
  • The Stress Value (SV) for the right tail (the 99% quantile of the tail distribution estimated with a Generalized Pareto distribution)

Table 5. Risk measures for the Bitcoin.
Risk measures for the Bitcoin
Source: computation by the author (data: Yahoo! Finance website).

The volatility is a global measure of risk as it considers all the returns. The Value at Risk (VaR), Expected Shortfall (ES) and Stress Value (SV) are local measures of risk as they focus on the tails of the distribution. The study of the left tail is relevant for an investor holding a long position in the Bitcoin while the study of the right tail is relevant for an investor holding a short position in the Bitcoin.

Why should I be interested in this post?

Students would be keenly interested in this article discussing Bitcoin’s history and trends due to its profound influence on the financial landscape. Bitcoin, as a novel and dynamic asset class, presents a unique opportunity for students to explore the evolving world of finance. By delving into Bitcoin’s past, understanding its market trends, and assessing its impact on global economies, students can equip themselves with the knowledge and skills needed to navigate a financial landscape that is increasingly intertwined with cryptocurrencies and blockchain technology. Moreover, this knowledge can enhance their career prospects in an industry undergoing significant transformation and innovation.

Related posts on the SimTrade blog

About cryptocurrencies

   ▶ Snehasish CHINARA How to get crypto data

   ▶ Alexandre VERLET Cryptocurrencies

   ▶ Youssef EL QAMCAOUI Decentralised Financing

   ▶ Hugo MEYER The regulation of cryptocurrencies: what are we talking about?

About statistics

   ▶ Shengyu ZHENG Moments de la distribution

   ▶ Shengyu ZHENG Mesures de risques

   ▶ Jayati WALIA Returns

Useful resources

Academic research about risk

Longin F. (2000) From VaR to stress testing: the extreme value approach Journal of Banking and Finance, N°24, pp 1097-1130.

Longin F. (2016) Extreme events in finance: a handbook of extreme value theory and its applications Wiley Editions.

Data

Yahoo! Finance

Yahoo! Finance Historical data for Bitcoin

CoinMarketCap Historical data for Bitcoin

About the author

The article was written in September 2023 by Snehasish CHINARA (ESSEC Business School, Grande Ecole Program – Master in Management, 2022-2024).

Extreme returns and tail modelling of the S&P 500 index for the US equity market

Extreme returns and tail modelling of the S&P 500 index for the US equity market

Shengyu ZHENG

In this article, Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024) describes the statistical behavior of extreme returns of the S&P 500 index for the US equity market and explains how extreme value theory can be used to model the tails of its distribution.

The S&P 500 index for the US equity market

The S&P 500, or the Standard & Poor’s 500, is a renowned stock market index encompassing 500 of the largest publicly traded companies in the United States. These companies are selected based on factors like market capitalization and sector representation, making the index a diversified and reliable reflection of the U.S. stock market. It is a market capitalization-weighted index, where companies with larger market capitalizations have a greater influence on its performance. The S&P 500 is widely used as a benchmark to assess the health and trends of the U.S. economy and as a performance reference for individual stocks and investment products, including exchange-traded funds (ETF) and index funds. Its historical significance, economic indicator status, and global impact contribute to its status as a critical barometer of market conditions and overall economic health.

Characterized by its diversification and broad sector representation, the S&P 500 remains an essential tool for investors, policymakers, and economists to analyze market dynamics. This index’s performance, affected by economic data, geopolitical events, corporate earnings, and market sentiment, can provide valuable insights into the state of the U.S. stock market and the broader economy. Its rebalancing ensures that it remains current and representative of the ever-evolving landscape of American corporations. Overall, the S&P 500 plays a central role in shaping investment decisions and assessing the performance of the U.S. economy.

In this article, we focus on the S&P 500 index over the period from April 1, 2015 to April 1, 2023. Here we have a line chart depicting the evolution of the index level over this period. We can observe an overall increase, with remarkable drops during the Covid-19 crisis (2020) and the Russian invasion of Ukraine (2022).

Figure 1 below gives the evolution of the S&P 500 index from April 1, 2015 to April 1, 2023 on a daily basis.

Figure 1. Evolution of the S&P 500 index.
Evolution of the S&P 500 index
Source: computation by the author (data: Yahoo! Finance website).

Figure 2 below gives the evolution of the daily logarithmic returns of the S&P 500 index from April 1, 2015 to April 1, 2023. We observe volatility clustering, reflecting large price fluctuations in both directions (up and down movements). This alternation of periods of low and high volatility is well modeled by ARCH models.

Figure 2. Evolution of the S&P 500 index logarithmic returns.
Evolution of the S&P 500 index return
Source: computation by the author (data: Yahoo! Finance website).

Summary statistics for the S&P 500 index

Table 1 below presents the summary statistics estimated for the S&P 500 index:

Table 1. Summary statistics for the S&P 500 index.
summary statistics of the S&P 500 index returns
Source: computation by the author (data: Yahoo! Finance website).

The mean, the standard deviation / variance, the skewness, and the kurtosis refer to the first, second, third and fourth moments of the statistical distribution of returns respectively. We can conclude that over this timeframe, the S&P 500 index exhibits a slight upward trend, relatively large daily fluctuations, negative skewness, and excess kurtosis.
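These four moments can be estimated directly with scipy. The sketch below uses an illustrative sample of daily returns, not the actual index data:

```python
import numpy as np
from scipy import stats

# Illustrative sample of daily returns (in %), not the actual S&P 500 series
returns = np.array([0.2, -0.5, 1.1, -2.3, 0.4, 0.7, -0.1, 3.0, -4.2, 0.3])

print("mean:    ", returns.mean())
print("std dev: ", returns.std(ddof=1))       # sample standard deviation
print("skewness:", stats.skew(returns))
print("kurtosis:", stats.kurtosis(returns))   # excess kurtosis (0 for a normal distribution)
```

Note that `stats.kurtosis` reports excess kurtosis, so a positive value indicates fatter tails than the normal distribution.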

Tables 2 and 3 below present the top 10 negative daily returns and top 10 positive daily returns for the S&P 500 index over the period from April 1, 2015 to April 1, 2023.

Table 2. Top 10 negative daily returns for the S&P 500 index.
Top 10 negative returns of the S&P 500 index
Source: computation by the author (data: Yahoo! Finance website).

Table 3. Top 10 positive daily returns for the S&P 500 index.
Top 10 positive returns of the S&P 500 index
Source: computation by the author (data: Yahoo! Finance website).

Modelling of the tails

Here the tail modelling is conducted based on the Peak-over-Threshold (POT) approach which corresponds to a Generalized Pareto Distribution (GPD). Let’s recall the theoretical background of this approach.

The POT approach considers all observations above a designated high threshold u. The threshold exceedances can be fitted with a generalized Pareto distribution:

 Illustration of the POT approach

An important issue for the POT-GPD approach is the threshold selection. An optimal threshold level can be derived by calibrating the tradeoff between bias and inefficiency. There exist several approaches to address this problem, including a Monte Carlo simulation method inspired by the work of Jansen and de Vries (1991). In this article, to fit the GPD, we use the 2.5% quantile for the modelling of the negative tail and the 97.5% quantile for that of the positive tail.
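As a sketch of this fitting step, the GPD parameters of the threshold exceedances can be estimated by maximum likelihood with scipy's genpareto distribution. The example below uses simulated heavy-tailed data, since the article's return series is not reproduced here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Illustrative heavy-tailed sample standing in for daily returns
returns = rng.standard_t(df=3, size=5000)

# POT: set the threshold at the 97.5% empirical quantile and keep the exceedances
u = np.quantile(returns, 0.975)
excesses = returns[returns > u] - u

# Fit a GPD with location fixed at 0 (exceedances are measured from the threshold)
xi, loc, beta = stats.genpareto.fit(excesses, floc=0)
print(f"threshold u = {u:.3f}, shape xi = {xi:.3f}, scale beta = {beta:.3f}")
```

For the negative tail, the same procedure is applied to the losses (the opposite of the returns) above the 97.5% quantile of the loss distribution.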

Based on the POT-GPD approach with a fixed threshold selection, we arrive at the following modelling results for the GPD for negative extreme returns (Table 4) and positive extreme returns (Table 5) for the S&P 500 index:

Table 4. Estimate of the parameters of the GPD for negative daily returns for the S&P 500 index.
Estimate of the parameters of the GPD for negative daily returns for the S&P 500 index
Source: computation by the author (data: Yahoo! Finance website).

Table 5. Estimate of the parameters of the GPD for positive daily returns for the S&P 500 index.
Estimate of the parameters of the GPD for positive daily returns for the S&P 500 index
Source: computation by the author (data: Yahoo! Finance website).

Figure 3. GPD for the left tail of the S&P 500 index returns.
GPD for the left tail of the S&P 500 index returns
Source: computation by the author (data: Yahoo! Finance website).

Figure 4. GPD for the right tail of the S&P 500 index returns.
GPD for the right tail of the S&P 500 index returns
Source: computation by the author (data: Yahoo! Finance website).

Applications in risk management

Extreme Value Theory (EVT) is a statistical approach used to analyze the tails of a distribution, focusing on extreme events or rare occurrences. EVT can be applied to various risk management techniques, including Value at Risk (VaR), Expected Shortfall (ES), and stress testing, to provide a more comprehensive understanding of extreme risks in financial markets.

Why should I be interested in this post?

Extreme Value Theory is a useful tool to model the tails of the evolution of a financial instrument. In the ever-evolving landscape of financial markets, being able to grasp the concept of EVT presents a unique edge to students who aspire to become an investment or risk manager. It not only provides a deeper insight into the dynamics of equity markets but also equips them with a practical skill set essential for risk analysis. By exploring how EVT refines risk measures like Value at Risk (VaR) and Expected Shortfall (ES) and its role in stress testing, students gain a valuable perspective on how financial institutions navigate during extreme events. In a world where financial crises and market volatility are recurrent, this post opens the door to a powerful analytical framework that contributes to informed decisions and financial stability.

Download R file to model extreme behavior of the index

You can find below an R file (file with txt format) to study extreme returns and model the distribution tails for the S&P 500 index.

Download R file to study extreme returns and model the distribution tails for the S&P 500 index

Related posts on the SimTrade blog

About financial indexes

   ▶ Nithisha CHALLA Financial indexes

   ▶ Nithisha CHALLA Calculation of financial indexes

   ▶ Nithisha CHALLA The S&P 500 index

About portfolio management

   ▶ Youssef LOURAOUI Portfolio

   ▶ Jayati WALIA Returns

About statistics

   ▶ Shengyu ZHENG Moments de la distribution

   ▶ Shengyu ZHENG Mesures de risques

   ▶ Shengyu ZHENG Extreme Value Theory: the Block-Maxima approach and the Peak-Over-Threshold approach

   ▶ Gabriel FILJA Application de la théorie des valeurs extrêmes en finance de marchés

Useful resources

Academic resources

Embrechts P., C. Klüppelberg and T. Mikosch (1997) Modelling Extremal Events for Insurance and Finance Springer-Verlag.

McNeil A.J., R. Frey and P. Embrechts (2022) Quantitative Risk Management Princeton University Press.

Gumbel, E. J. (1958) Statistics of extremes New York: Columbia University Press.

Longin F. (2016) Extreme events in finance: a handbook of extreme value theory and its applications Wiley Editions.

Other resources

Extreme Events in Finance

Chan S. Statistical tools for extreme value analysis

Rieder H. E. (2014) Extreme Value Theory: A primer (slides).

About the author

The article was written in October 2023 by Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024).

Statistical distributions

Statistical distributions: discrete vs continuous variables

Shengyu ZHENG

In this article, Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024) explains statistical distributions for discrete and continuous random variables.

Discrete and continuous random variables

A random variable is a variable whose value is determined by the outcome of a random event. More precisely, the variable X is a measurable function from a set of outcomes (Ω) to a measurable space (E).

X : Ω → E

There are two main types of random variables: discrete and continuous.

A discrete random variable takes values in a countable set, such as the set of natural numbers. For example, the number of points scored during a basketball game is a discrete random variable, since it can only take integer values such as 0, 1, 2, 3, etc. The probabilities associated with each possible value of a discrete random variable are called probability masses.

In contrast, a continuous random variable takes values in an uncountable set, such as the set of real numbers. For example, a person’s height or weight is a continuous random variable, since it can take any positive real value. The probabilities associated with a continuous random variable are determined by a probability density function. This function measures the probability that the random variable falls within a given interval of values.

Methods for describing statistical distributions

To better understand a random variable, there are several ways to describe its distribution.

Computing statistics

A statistic is the result of a sequence of operations applied to a set of observations called a sample; it is a numerical measure that summarizes a characteristic of this set. The mean is an example of a statistic.
Statistics can be divided into two main types: descriptive statistics and inferential statistics.

Descriptive statistics are used to summarize and describe the basic characteristics of a dataset. They include measures such as the moments of a distribution (mean, variance, skewness, kurtosis, etc.). A more detailed explanation is available in the article Moments de la distribution.

Inferential statistics, on the other hand, are used to draw inferences about a population from a sample of data. They include hypothesis tests, confidence intervals, regression analyses, predictive models, etc.

Histogram

A histogram is a type of chart that represents the distribution of the data in a sample. It consists of a series of vertical bars, where each bar represents a range of values of the variable under study (called a bin) and its height corresponds to the frequency of the observations in that bin.

The histogram is a widely used tool to visualize the distribution of data and to identify trends and patterns, both for discrete variables and for discretized continuous variables.

Probability mass function and probability density function

A probability mass function is a mathematical function that describes the probability distribution of a discrete random variable.

The probability mass function assigns a probability to each possible value of the discrete random variable. For example, if X is a discrete random variable taking the values 1, 2, 3 and 4 with respective probabilities 0.2, 0.3, 0.4 and 0.1, then the probability mass function of X (a multinomial distribution) is given by:
P(X=1) = 0.2
P(X=2) = 0.3
P(X=3) = 0.4
P(X=4) = 0.1

Note that the sum of the probabilities over all possible values of the random variable must equal 1, that is, for any discrete random variable X:
∑ P(X=x) = 1
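A quick Python check of this probability mass function (the values are those of the example above):

```python
# Probability mass function of X from the example above
pmf = {1: 0.2, 2: 0.3, 3: 0.4, 4: 0.1}

# The probabilities over all possible values must sum to 1
assert abs(sum(pmf.values()) - 1.0) < 1e-12

# The expected value E[X] is the probability-weighted sum of the values
expected_value = sum(x * p for x, p in pmf.items())
print(round(expected_value, 2))  # 1*0.2 + 2*0.3 + 3*0.4 + 4*0.1 = 2.4
```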

Figure 1. Probability mass function of a multinomial distribution (for a discrete variable).
Probability mass function of a multinomial distribution
Source: computation by the author.

By contrast, a probability density function represents the probability distribution of a continuous random variable. The density function is used to compute the probability that the random variable takes a value within a given interval.
Graphically, the area under the density curve between two values a and b corresponds to the probability that the random variable takes a value in the interval [a, b].

Note that the density function is continuous, positive and integrable over its entire domain. The integral of the density function over all possible values of the random variable equals 1.

Figure 2. Probability density function of a normal distribution (for a continuous variable).
Probability density function of a normal distribution
Source: computation by the author.

Cumulative distribution function

The cumulative distribution function is a mathematical function that describes the probability that a random variable takes a value less than or equal to a given value. It is defined for all random variables, whether continuous or discrete.
For a discrete random variable, the cumulative distribution function F(x) is defined as the sum of the probabilities of the values less than or equal to x:

F(x) = P(X ≤ x) = Σ P(X = xi) for xi ≤ x

For a continuous random variable, the cumulative distribution function F(x) is defined as the integral of the probability density f(t) from -∞ to x:
F(x) = P(X ≤ x) = ∫-∞^x f(t) dt

Examples

In this section, we take two examples of statistical distribution analysis: one for a discrete random variable and one for a continuous variable.

Discrete variable: outcome of rolling a six-sided die

Rolling a six-sided die produces a random outcome between 1 and 6, corresponding to the six faces of the die. The outcomes only take the integer values 1, 2, 3, 4, 5 and 6, and they all have the same probability of 1/6.

In this example, the R code simulates N die rolls and visualizes the distribution of the N outcomes with a histogram. Using this code, it is possible to simulate series of die rolls and analyze the outcomes to better understand the probability distribution.

If this random experiment is repeated 1,000 times, we obtain a result whose histogram looks like:

Figure 3. Histogram of the outcomes of rolls of a six-sided die.
Histogram of the outcomes of rolls of a six-sided die
Source: computation by the author.

We observe that the outcomes are evenly distributed and tend to converge towards the theoretical probability of 1/6.
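The article relies on R code for this simulation; an equivalent sketch in Python (using numpy, with a fixed seed for reproducibility) would be:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000

# Simulate N rolls of a fair six-sided die
rolls = rng.integers(1, 7, size=N)

# Empirical frequency of each face vs the theoretical probability 1/6
values, counts = np.unique(rolls, return_counts=True)
for v, c in zip(values, counts):
    print(f"face {v}: empirical frequency = {c / N:.3f} (theoretical: {1/6:.3f})")
```

As N grows, the empirical frequencies converge towards 1/6 by the law of large numbers.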

Continuous variable: returns of the CAC 40 index

The return of a stock index like the CAC 40 for the French market is a continuous random variable because it can take any real value.

We use the historical daily closing prices of the CAC 40 index from April 1, 2021 to April 1, 2023 to compute daily returns (logarithmic returns).

In finance, the distribution of the daily returns of the CAC 40 index is often modelled by a normal distribution, even though the normal distribution does not necessarily fit the observed distribution well, especially in the tails. In the chart below, we can see that the normal distribution does not describe the actual distribution well.

Figure 4. Probability density function of the daily returns of the CAC 40 index (continuous variable).
Probability density function of the daily returns of the CAC 40 index
Source: computation by the author.

For observations of a continuous variable, it is always possible to group the observations into intervals and represent them in a histogram.

Table 1 below gives the descriptive statistics for the daily returns of the CAC 40 index.

Table 1. Descriptive statistics for the daily returns of the CAC 40 index.

Descriptive statistic Value
Mean 0.035
Median 0.116
Standard deviation 1.200
Skewness -0.137
Kurtosis 6.557

The computed descriptive statistics are consistent with what we can observe in the chart. The distribution of returns has a slightly positive mean. The tails of the empirical distribution are fatter than those of the normal distribution, given the occurrence of extreme (positive or negative) returns.

R file for this article

Download R file

About the author

This article was written in October 2023 by Shengyu ZHENG (ESSEC Business School, Grande Ecole Program – Master in Management, 2020-2024).

Application of extreme value theory in financial markets

Gabriel FILJA

In this article, Gabriel FILJA (ESSEC Business School, Executive Master in Senior Bank Management, 2022-2023 & Head of Hedging at Convera) presents applications of extreme value theory in financial markets, notably in market risk management.

Principle

Extreme value theory (EVT), embodied in the Fisher-Tippett-Gnedenko theorem, attempts to provide a complete characterization of the tail behavior of all types of probability distributions.

Extreme value theory shows that the asymptotic distribution of minimal and maximal returns has a well-determined form that is largely independent of the return process itself (the link between the two distributions appears in particular in the value of the tail index, which reflects the weight of the distribution tails). The interest of EVT in risk management is the ability to compute quantiles beyond the 99% confidence level for stress tests or for the publication of regulatory requirements.

Gnedenko proved in 1943, through extreme value theory, a property that applies to many probability distributions. Let F(x) be the cumulative distribution function of a variable x, and let u be a value of x located in the right tail of the distribution.

The probability that x lies between u and u+y is F(u+y) – F(u), and the probability that x is greater than u is 1 – F(u). Let Fu(y) be the conditional probability that x lies between u and u+y given that x > u:

Conditional probability

Parameter estimation

According to Gnedenko’s results, for a large class of distributions, this converges to a generalized Pareto distribution as u increases:

Generalized Pareto distribution

β is the scale parameter, which represents the dispersion of the distribution of extremes
ξ is the tail index, which measures the thickness and the shape of the tail

Depending on the value of the tail index, three forms of extreme value distributions are distinguished:

  • Fréchet: ξ > 0
  • Weibull: ξ < 0
  • Gumbel: ξ = 0

The tail index ξ reflects the weight of extremes in the distribution of returns. A negative value of the tail index means that extremes play no important role, since the variable is bounded. A zero value yields relatively few extremes (this is the case of the normal distribution), whereas a positive value implies a large number of extremes (fat tails).

Figure 1. Density of the extreme value distributions.
Density of the extreme value distributions
Source: author.

Table 1. Extreme value distribution functions: Fréchet distribution for ξ > 0, Weibull distribution for ξ < 0 and Gumbel distribution for ξ = 0.
Extreme value distribution functions
Source: author.

The parameters β and ξ are estimated by the maximum likelihood method. First, the threshold u must be defined (a value close to the 95th percentile, for example). One method to determine this threshold is the technique called Peak Over Threshold (POT), or the exceedances-over-threshold method, which focuses on the observations that exceed a given threshold. Instead of considering the maximal or largest values, this method examines all the observations that exceed a high, pre-set threshold.
The objective is to select an adequate threshold and to analyze the resulting exceedances. We then sort the results in decreasing order to obtain the observations such that x > u and their total number.

We now study the extreme returns of the Société Générale stock over the period 2011-2021. Figure 2 shows the daily returns of the stock and the negative extreme returns obtained with the Peak Over Threshold (POT) approach. With the selected threshold of -7%, we obtain 33 exceedances out of the 2,595 daily returns of the period 2011 to 2021.

Figure 2. Selection of the negative extreme returns for the Société Générale stock with the Peak Over Threshold (POT) approach.
Selection of the extreme returns for the Société Générale stock
Source: author.

Statistical estimation method

We now see how to determine β and ξ using the maximum likelihood function, which is written:

Likelihood function

For a sample of n observations, the estimate of 1 – F(u) is n_u/n. In this case, the unconditional probability that x > u+y is:

Likelihood function

And the estimator of the tail of the cumulative probability distribution of x (for a large u) is:

Tail distribution estimator

My personal work consisted in estimating the scale parameter β and the tail parameter ξ from the formula by maximum likelihood using the Excel solver. We previously determined the threshold u = -7% with the POT method in Figure 2, with n = 2,595 daily returns and n_u = 33 exceedances.

We thus obtain β = 0.0378 and ξ = 0.0393, which maximizes the sum of the log-likelihood of the extreme values at a total of 73.77.

Estimation of the EVT VaR

To compute the VaR at the confidence level q, we solve F(VaR) = q:

EVT VaR

My personal work consisted in estimating the VaR of the Société Générale stock over the period from 2011 to 2021, on a total of 2,595 quotes with 33 exceedances of the threshold (-7%). Applying the obtained estimates to the formula, we get:

VaR 99% Société Générale

Then we estimate the VaR at the 99.90% and 99.95% confidence levels:

VaR 99.90% Société Générale
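Assuming the standard POT formula for the EVT VaR, VaR_q = u + (β/ξ)[((n/n_u)(1-q))^(-ξ) - 1], the computation can be sketched in Python with the estimates reported above (the threshold is expressed here as a positive loss of 7%):

```python
def evt_var(u, beta, xi, n, n_u, q):
    """POT-based VaR: u + (beta/xi) * (((n/n_u) * (1 - q)) ** (-xi) - 1)."""
    return u + (beta / xi) * (((n / n_u) * (1 - q)) ** (-xi) - 1)

# Estimates reported above for the Société Générale stock (loss terms)
u, beta, xi, n, n_u = 0.07, 0.0378, 0.0393, 2595, 33

for q in (0.99, 0.999, 0.9995):
    print(f"VaR at {q:.2%} confidence: {evt_var(u, beta, xi, n, n_u, q):.4f}")
```

The VaR grows with the confidence level q, and the growth rate is governed by the tail index ξ.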

It is not surprising that extrapolating the tail of a probability distribution is difficult, not because it is hard to identify possible probability distributions that could fit the observed data (it is relatively easy to find many different candidate distributions), but because the range of answers that can plausibly be obtained can be very wide, especially if we want to extrapolate into the far tail, where there may be few or no directly applicable observation points.

Extreme value theory, when used to model tail behavior beyond the range of the observed dataset, is a form of extrapolation. Part of the cause of fat-tail behavior is the impact that human behavior (including investor sentiment) has on market behavior.

Why should I be interested in this post?

We can conduct stress tests using extreme value theory and assess the impacts on a bank’s balance sheet, or determine risk limits for trading and thus obtain a better estimate of the worst-case scenario.

Related posts on the SimTrade blog

▶ Shengyu ZHENG Catégories de mesures de risques

▶ Shengyu ZHENG Moments de la distribution

▶ Shengyu ZHENG Extreme Value Theory: the Block-Maxima approach and the Peak-Over-Threshold approach

Useful resources

Academic resources

Falk M., J. Hüsler and R.-D. Reiss (2011) Laws of Small Numbers: Extremes and Rare Events Springer Basel. doi: 10.1007/978-3-0348-0009-9.

Gilli M. and E. Këllezi (2006) An Application of Extreme Value Theory for Measuring Financial Risk Computational Economics, 27(2), 207-228. doi: 10.1007/s10614-006-9025-7.

Gkillas K. and F. Longin (2018) Financial market activity under capital controls: lessons from extreme events Economics Letters, 171, 10-13.

Gnedenko B. (1943) Sur La Distribution Limite Du Terme Maximum D’Une Serie Aleatoire Annals of Mathematics, 44(3), 423-453. doi: 10.2307/1968974.

Hull J. and A. White (2017) Optimal delta hedging for options Journal of Banking & Finance, 82, 180-190. doi: 10.1016/j.jbankfin.2017.05.006.

Longin F. (1996) The asymptotic distribution of extreme stock market returns Journal of Business, 63, 383-408.

Longin F. (2000) From VaR to stress testing: the extreme value approach Journal of Banking and Finance, 24, 1097-1130.

Longin F. (2016) Extreme events in finance: a handbook of extreme value theory and its applications Wiley Editions.

Longin F. and B. Solnik (2001) Extreme Correlation of International Equity Markets, The Journal of Finance, 56, 649-676.

Roncalli T. and G. Riboulet Stress testing et théorie des valeurs extrêmes : une vision quantitée du risque extrême.

Websites

Extreme Events in Finance

About the author

This article was written in July 2023 by Gabriel FILJA (ESSEC Business School, Executive Master in Senior Bank Management, 2022-2023 & Head of Hedging at Convera).

How to get crypto data

How to get crypto data

 Snehasish CHINARA

In this article, Snehasish CHINARA (ESSEC Business School, Grande Ecole Program – Master in Management, 2022-2024) explains how to get crypto data.

Types of data

Number of coins

The information on the number of coins in circulation for a given currency is important to compute its market capitalization. Market capitalization is calculated by multiplying the current price of the cryptocurrency by its circulating number of coins (supply). This metric gives a rough estimate of the cryptocurrency’s total value within the market and its relative size compared to other cryptocurrencies. A lower circulating supply often implies a greater level of scarcity and rarity.

For cryptocurrencies (unlike fiat money), the number of coins in circulation is given by a mathematical formula. The number of coins may be limited (like Bitcoin) or unlimited (like Ethereum and Dogecoin) over time.

Cryptocurrencies with limited supplies, such as Bitcoin’s maximum supply of 21 million coins, can be perceived as more valuable due to their finite nature. Scarcity can contribute to investor interest and potential price appreciation over time. A lower circulating supply might indicate the potential for future adoption and value appreciation, as the limited supply can create scarcity-driven demand, especially if the cryptocurrency gains more utility and usage.

Bitcoin’s blockchain also relies on a key equation to steadily allow new BTC to be introduced. The equation below gives the total supply of bitcoins:

Total supply of bitcoins

Figure 1 below represents the evolution of the supply of Bitcoins.

Figure 1. Evolution of the supply of Bitcoins

Source: computation by the author.

Market price of a coin

The market price of a cryptocurrency in the market holds crucial insights into how well the cryptocurrency is faring. Although not the sole factor, the market price significantly contributes to evaluating the cryptocurrency’s performance and its prospects. The market price of a cryptocurrency is a dynamic and intricate element that reflects a multitude of factors, both intrinsic and extrinsic. The gradual rise in market value over time indicates a willingness among investors and traders to offer higher prices for the cryptocurrency. This signifies a rising interest and strong belief in the project’s potential for the future. The market price reflects the collective sentiment of investors and traders. Comparing the market price of a cryptocurrency to other similar cryptocurrencies or benchmark assets like Bitcoin can provide insights into its relative strength and performance within the market. A rising market price can indicate increasing adoption of the cryptocurrency for various use cases. Successful projects tend to attract more users and real-world applications, which can drive up the price.

The value of cryptocurrencies in the market is influenced by a variety of elements, with each factor contributing uniquely to their pricing. One of the most significant influences is market sentiment and investor psychology. These factors can cause prices to shift based on positive news, regulatory changes, or reactive selling due to fear. Furthermore, the real-world implementation and usage of a cryptocurrency are crucial for its prosperity. Concrete use cases such as Decentralized Finance (DeFi), Non-Fungible Tokens (NFTs), and international transactions play a vital role in creating demand and propelling price appreciation. Meanwhile, adherence to basic economic principles is evident in the supply-demand dynamics, where scarcity due to limited issuance, halving events, and token burns interact with the balance between supply and demand.

With the number of coins in circulation, the information on the price of coins for a given currency is also important to compute its market capitalization.

Figure 2 below represents the evolution of the price of Bitcoin in US dollar over the period October 2014 – August 2023. The price corresponds to the “closing” price (observed at 10:00 PM CET at the end of the month).

Figure 2. Evolution of the Bitcoin price
Evolution of the Bitcoin price
Source: computation by the author (data source: Yahoo! Finance).

Trading volume

Trading volume is crucial when assessing the health, reliability, and potential price movements of a cryptocurrency. Trading volume refers to the total amount of a cryptocurrency that is bought and sold within a specific time frame, typically measured in units of the cryptocurrency (e.g., BTC) or in terms of its equivalent value in another currency (e.g., USD).

Trading volume directly mirrors market liquidity, with higher volumes indicative of more liquid markets. This liquidity safeguards against drastic price fluctuations when trading, contrasting with low-volume scenarios that can breed volatility, where even a single substantial trade may disproportionately shift prices. Price alterations are most reliable and meaningful when accompanied by substantial trading volume. Price movements upheld by heightened volume often hold greater validity, potentially pointing to more pronounced market sentiment. When price surges parallel rising trading volume, it suggests a sustainable upward trajectory. Conversely, low trading volume amid rising prices may hint at a forthcoming correction or reversal. Scrutinizing the correlation between price oscillations and trading volume can uncover potential divergences. For instance, ascending prices coupled with dwindling trading volume may suggest a weakening trend.

Figure 3 below represents the evolution of the monthly trading volume of Bitcoin over the period October 2014 – July 2023.

Figure 3. Evolution of the trading volume of Bitcoin
Evolution of the trading volume of Bitcoin
Source: computation by the author (data source: Yahoo! Finance).

Bitcoin data

You can download the Excel file with Bitcoin data used in this post as an illustration.

Download the Excel file with Bitcoin data

Python code

You can download the Python code used to download the data from Yahoo! Finance.

Python script to download Bitcoin historical data and save it to an Excel sheet:

import yfinance as yf
import pandas as pd

# Define the ticker symbol and date range
ticker_symbol = "BTC-USD"
start_date = "2020-01-01"
end_date = "2023-01-01"

# Download historical data using yfinance
data = yf.download(ticker_symbol, start=start_date, end=end_date)

# Create a Pandas DataFrame
df = pd.DataFrame(data)

# Write the DataFrame to an Excel sheet and save the file
with pd.ExcelWriter('bitcoin_historical_data.xlsx', engine='openpyxl') as excel_writer:
    df.to_excel(excel_writer, sheet_name='Bitcoin Historical Data')

print("Data has been saved to bitcoin_historical_data.xlsx")

# Make sure you have the required libraries installed (yfinance, pandas, openpyxl)
# and adjust the "start_date" and "end_date" variables to the desired date range
# for the historical data you want to download.

APIs

Calculating the total number of Bitcoins in circulation over time requires access to Bitcoin blockchain data, either by running a Bitcoin node or by using blockchain data providers like Blockchain.info, Blockchair, or a similar service.

Extract Block Data: Once you have access to the blockchain data, you would need to extract information from each block. Each block contains a record of the transactions that have occurred, including the creation (mining) of new Bitcoins in the form of a “Coinbase” transaction.

Calculate Cumulative Supply: You can calculate the cumulative supply of Bitcoins by adding up the rewards from each block’s Coinbase transaction. Initially, the block reward was 50 Bitcoins, but it halves approximately every four years due to the Bitcoin halving events. So, you’ll need to account for these halvings in your calculations.
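The cumulative-supply logic described above can be sketched as follows. This is a simplified model assuming exact 210,000-block halving epochs and floating-point rewards; the real consensus rules work in integer satoshis:

```python
def cumulative_supply(n_blocks):
    """Total BTC issued after n_blocks blocks (simplified model)."""
    supply = 0.0
    subsidy = 50.0          # initial block reward in BTC
    remaining = n_blocks
    # Each halving epoch lasts 210,000 blocks; stop below 1 satoshi
    while remaining > 0 and subsidy >= 1e-8:
        blocks = min(remaining, 210_000)
        supply += blocks * subsidy
        remaining -= blocks
        subsidy /= 2
    return supply

# The geometric halving schedule caps the total supply just below 21 million BTC
print(f"{cumulative_supply(33 * 210_000):,.0f} BTC")
```

The first epoch alone issues 210,000 × 50 = 10.5 million BTC, half of the eventual maximum, which is why the supply curve in Figure 1 flattens over time.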

Python code

import requests

# Replace 'YOUR_API_KEY' with your CoinMarketCap API key
api_key = 'YOUR_API_KEY'

# Define the endpoint URL for CoinMarketCap's API
url = 'https://pro-api.coinmarketcap.com/v1/cryptocurrency/quotes/latest'

# Define the query parameters for the request
params = {
    'symbol': 'BTC',
    'convert': 'USD',
}

# The API key is passed in a header, as documented by CoinMarketCap
headers = {
    'X-CMC_PRO_API_KEY': api_key,
}

# Send the request to CoinMarketCap
response = requests.get(url, params=params, headers=headers)

# Parse the response JSON
data = response.json()

# Extract the circulating supply from the response
circulating_supply = data['data']['BTC']['circulating_supply']

print(f"Current circulating supply of Bitcoin: {circulating_supply} BTC")

# Replace 'YOUR_API_KEY' with your actual CoinMarketCap API key.

Why should I be interested in this post?

Cryptocurrency data is becoming increasingly relevant in these fields, offering opportunities for research, data analysis skill development, and even career prospects. Whether you’re aiming to conduct research, stay informed about the evolving financial landscape, or simply enhance your data analysis abilities, understanding how to access and work with crypto data is an asset. Plus, as the cryptocurrency industry continues to grow, this knowledge can open new career paths and improve your personal finance decision-making. In a rapidly changing world, diversifying your knowledge with cryptocurrency data acquisition skills can be a wise investment in your future.

Related posts on the SimTrade blog

▶ Alexandre VERLET Cryptocurrencies

▶ Youssef EL QAMCAOUI Decentralised Financing

▶ Hugo MEYER The regulation of cryptocurrencies: what are we talking about?

Useful resources

APIs

CoinMarketCap Source of API keys and program

CoinGecko Source of API keys and Programs

CryptoNews Source of API keys and Programs

Data sources

Yahoo! Finance Historical data for Bitcoin

Coinmarketcap Historical data for Bitcoin

Blockchain.com Market Data and charts on Bitcoin history

About the author

The article was written in October 2023 by Snehasish CHINARA (ESSEC Business School, Grande Ecole Program – Master in Management, 2022-2024).

Market Capitalization

Market Capitalization

Nithisha CHALLA

In this article, Nithisha CHALLA (ESSEC Business School, Grande Ecole Program – Master in Management, 2021-2023) explains Market Capitalization and its specificities.

What is Market Capitalization?

Market capitalization is a key metric used to assess the size and value of publicly traded companies. It represents the company’s value for the owners of the company (the shareholders or stockholders). This metric allows companies to be classified as large-cap, mid-cap, or small-cap based on their respective market-capitalization sizes.

Large-cap companies are typically more established, with market capitalizations exceeding several billion dollars. They are more stable and frequently represent industry leaders. In the US stock market, Apple, Microsoft, and Amazon are examples of large-cap companies.

Mid-cap companies fall between large-cap and small-cap companies. They are typically businesses that have seen moderate growth and may still have room for expansion. Mid-cap companies are frequently regarded as having a good balance of growth potential and stability. Examples include Etsy Inc., DocuSign Inc., and Spotify Technology S.A.

Small-cap companies have lower market capitalizations than large-cap and mid-cap firms. They are generally thought to have greater growth potential, but also greater risk due to their smaller size and possibly limited resources. Examples include NeoGenomics Inc., Clean Energy Fuels Corp., and Axon Enterprise Inc.
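The three segments above can be sketched as a small classifier. The dollar thresholds used below (large-cap above $10 billion, mid-cap between $2 billion and $10 billion, small-cap below $2 billion) are commonly cited conventions, not fixed rules, and the input values are hypothetical:

```python
def cap_segment(market_cap_usd):
    """Classify a company by market capitalization (conventional thresholds)."""
    if market_cap_usd >= 10e9:
        return "large-cap"
    if market_cap_usd >= 2e9:
        return "mid-cap"
    return "small-cap"

print(cap_segment(2_500e9))  # large-cap (e.g. a multi-trillion-dollar firm)
print(cap_segment(5e9))      # mid-cap
print(cap_segment(800e6))    # small-cap
```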

Mathematical formula

The general formula for calculating market capitalization is:

Market Capitalization = Current Share Price x Number of Outstanding Shares

In this formula:
“Current Share Price” refers to the price of a single share of the company’s stock. It is the latest transaction price. As Market Capitalization is usually computed every day, the current share price corresponds to the closing price of the trading session.

“Number of Outstanding Shares” represents the total number of shares of the company’s stock that are publicly available and held by investors.
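The formula can be illustrated with a short computation. The price and share count below are hypothetical round numbers, not data for any real company:

```python
def market_capitalization(share_price, outstanding_shares):
    """Market cap = current share price x number of outstanding shares."""
    return share_price * outstanding_shares

price = 150.0           # closing price in USD (hypothetical)
shares = 2_000_000_000  # number of outstanding shares (hypothetical)

print(f"Market cap: ${market_capitalization(price, shares):,.0f}")
# A $150 share price and 2 billion shares give a $300 billion market cap,
# which would place the company firmly in the large-cap segment.
```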

The Significance of Stock Price

When considering market capitalization, the stock price is an important factor to consider. It represents the current market price at which a company’s shares are bought and sold. Stock prices, which are influenced by factors such as supply and demand, market sentiment, and company-specific news, play a critical role in determining a company’s market capitalization.

In the short term, as the number of shares issued by the company is stable, the stock price is the main factor influencing market capitalization.

How is the Number of Shares Computed?

The total number of outstanding shares of a company’s stock is used to calculate market capitalization. The outstanding shares are those that the company has issued and are held by shareholders, which include individual investors, institutional investors, and insiders.

The number of outstanding shares can be found in the company’s financial statements, specifically the balance sheet and the notes to the financial statements.

Which Shares are Included?

The outstanding shares generally include common shares or ordinary shares, which are the most common types of shares issued by companies. Preferred shares or other types of securities that may have different rights or characteristics are typically excluded from the calculation of market capitalization.

When we compute market capitalization, we take into consideration all outstanding shares of stock, which include publicly traded shares plus restricted shares held by the top management team and the founders of the company. Note that market capitalization is different from the float which takes into consideration only the shares available for trading in the secondary market.

If a company has different classes of shares with different voting rights or other characteristics, each class of shares may have its own market capitalization calculation based on the respective share price and the number of outstanding shares for that class.
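For a company with several share classes, the per-class calculation described above can be sketched as follows (the class names, prices, and share counts are hypothetical):

```python
# Hypothetical example: per-class market capitalization for a company
# with two share classes (all figures invented for illustration)
classes = {
    "Class A": {"price": 120.0, "shares": 500_000_000},
    "Class B": {"price": 118.0, "shares": 100_000_000},
}

# Market cap of each class: its own share price times its own share count
per_class = {name: c["price"] * c["shares"] for name, c in classes.items()}

# Combined figure across all classes
total = sum(per_class.values())

for name, cap in per_class.items():
    print(f"{name}: ${cap:,.0f}")
print(f"Total: ${total:,.0f}")
```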

Market capitalization provides an estimate of the overall value of the publicly traded portion of a company and is commonly used as a measure to compare companies or track changes in a company’s value over time.

Why should I be interested in this post?

Understanding market capitalization allows management students to analyze the financial health and performance of companies. By considering market capitalization along with other financial indicators, students can assess the relative size and value of companies in the market. Management students need to evaluate investment opportunities and determine the attractiveness of different stocks or companies based on their market capitalization and growth potential. Large-cap companies often offer stability and lower risk, while small-cap companies tend to be riskier but may have higher growth potential. Management students need to understand the risk-return tradeoff associated with different market capitalization segments.

Related posts on the SimTrade blog

   ▶ Nithisha CHALLA Financial indexes

   ▶ Nithisha CHALLA Calculation of financial indexes

   ▶ Nithisha CHALLA Float

   ▶ Nithisha CHALLA Top 5 companies by market capitalization in India

   ▶ Nithisha CHALLA Top 5 companies by market capitalization in China

   ▶ Nithisha CHALLA Top 5 companies by market capitalization in the United States

   ▶ Nithisha CHALLA Top 5 companies by market capitalization in Europe

Useful resources

Fidelity Investments Market capitalization

Wikipedia Market capitalization

Motley Fool An Example of Market Capitalization

About the author

The article was written in June 2023 by Nithisha CHALLA (ESSEC Business School, Grande Ecole Program – Master in Management, 2021-2023).