13 May 2022 • 0 likes • 105 views

Join the CMT Level 1, 2 & 3 program courses and become a professional Technical Analyst. CMT Institute live classes by expert faculty; exams are available in India. https://www.ptaindia.com/chartered-market-technician/


- 1. SECTION V Volatility Analysis
- 2. CHAPTER 23 Advance Techniques of Volatility Analysis
- 3. Learning Objectives ▪ Analyze the relationship between a system's entry signals and changes in market volatility ▪ Distinguish whether a system's entry signal should be filtered based on liquidity ▪ Calculate the expected move of an index or security based on volatility measures ▪ Explain the basics of using fractal efficiency, chaos theory, or genetic algorithms in trading ▪ Explain the basics of using neural network (machine learning) programming to trade with market data 3
- 4. Measuring Volatility ▪ The volatility of most price series, whether stocks, financial markets, commodities, or the spread between two series, is directly proportional to the price level: higher prices translate into higher volatility. ▪ This price-volatility relationship has been described as lognormal in the stock market and is similar to a percentage-growth relationship. ▪ A log relationship is one way of describing the link between price and volatility. ▪ On a linear price scale, prices are equally spaced, and each unit change is represented by the same vertical distance on the scale, regardless of the price level at which the change occurs. ▪ On a logarithmic price scale, prices are not positioned equidistantly; instead, the scale is plotted so that two equal percentage changes are shown as the same vertical distance. 4
- 5. The Price-Volatility Relationship ▪ Two stocks trading at the same price can have very different volatility. It is not only the nature of their business; stocks in the news tend to have more volatility. Measuring volatility correctly matters: - When sizing positions in order to equalize risk. - When balancing position sizes for pairs trading or market-neutral baskets. - When assessing risk in portfolios. ▪ While each of these issues is covered elsewhere in this book, volatility is the key driver of risk, and measuring it correctly is necessary to control that risk. 5
- 6. Adjusting for a Base Price ▪ Somewhere below the cost of production, volatility goes to zero. Call that level the base price for the purpose of calculating volatility. ▪ Then prices can be adjusted by dividing the current price by the base price (p0) or subtracting the base price from the current price, then taking the natural log (ln) of the adjusted value. ▪ The charts of either of these adjusted prices will look similar to the unadjusted semi-log plot; however, when the base price is high, the adjusted price will be more useful. 6
- 7. Adjusting for a Base Price Exceptions: ▪ 1. Interest rates trade as prices on the futures markets, which is inverse to the yield. When evaluating long-term volatility for rates, and in most cases when using percentages, yield should be used. ▪ 2. Foreign exchange has no base price, only equilibrium, the price that all traders, and the governments, accept as fair value for the moment. This situation is always short-lived. Prices get more volatile when they move away from equilibrium in either direction. ▪ 3. Energy is controlled by a cartel. They attempt to set the supply and target a price range. 7
- 8. Determining the Base Price Two straightforward ways: ▪ 1. Use a linear regression of prices. Apply a standard least-squares regression, available in Excel under Data/Data Analysis, with sequential integers as x and the closing prices as y. ▪ Then create the regression line as p̂t = a + b·t, where a is the y-intercept and b the slope. To find the base price, calculate the residuals, rt = pt − p̂t, then find the minimum residual value. Subtract that minimum value from the regression line to get the base price line. The base price continues to increase with time. ▪ 2. Use a linear regression of volatility. Instead of price, using a measure of volatility related to the price level will give a more direct view of the price-volatility relationship. 8
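Method 1 above can be sketched in a few lines of NumPy. This is a minimal illustration, not the book's exact program: the function name is made up, and "subtract the minimum residual" is interpreted as shifting the regression line so it touches the price point furthest below the trend, leaving every price on or above the base price line.

```python
import numpy as np

def base_price_line(prices):
    """Fit p_hat_t = a + b*t by least squares, then shift the fitted line
    through the point that sits furthest below the trend, so that every
    price lies on or above the shifted line (the base price line)."""
    t = np.arange(len(prices), dtype=float)
    b, a = np.polyfit(t, prices, 1)      # slope b, intercept a
    fitted = a + b * t
    residuals = prices - fitted          # r_t = p_t - p_hat_t
    return fitted + residuals.min()      # shift by the most negative residual

prices = np.array([10.0, 11.5, 11.0, 13.0, 12.5, 14.0])
base = base_price_line(prices)
```

Because the line keeps the regression slope, the base price continues to increase with time, as the slide notes.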
- 9. Determining the Base Price 9 Scatter Diagrams of Volatility Against Price for Monthly Copper, 1974 to July 2011. (a) Price and Monthly Returns. (b) Price and Monthly Price Range
- 10. The Time Interval ▪ The time period over which volatility is measured is also a significant factor in the price-volatility relationship. A longer period means that the net changes over n days may turn into months or years. ▪ Longer measurement periods give higher volatility values; however, the rate at which volatility increases will decline over time. 10 Change in Volatility Relative to the Interval Over Which It Is Measured.
- 11. An Example of a Lognormal Calculation ▪ Over the long term and under average market conditions, the relationship between actual price changes and volatility is expected to be: vols / volt = ln(ps) / ln(pt). ▪ For example, if the price on day t is 20 and the price on day s is 40, then the natural logs of prices on those days are ln(20) = 3.00 and ln(40) = 3.69. If the volatility is $1.00 when pt = 20, then the volatility is expected to be $1.23 when ps = 40. - If the volatility at 20 is 1.00, then the volatility at 40 is 1.23. When using a spreadsheet for your calculations, note that the function ln (natural log) is not the same as the function log (base 10). 11
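The worked example above reduces to a one-line calculation; a small sketch (function name is illustrative):

```python
import math

def expected_volatility(vol_t, p_t, p_s):
    """Lognormal price-volatility scaling: vol_s / vol_t = ln(p_s) / ln(p_t)."""
    return vol_t * math.log(p_s) / math.log(p_t)

# volatility $1.00 at a price of 20 implies about $1.23 at a price of 40
v = expected_volatility(1.00, 20, 40)
```

Note the use of `math.log` (natural log), not `math.log10`, matching the spreadsheet warning on the slide.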
- 12. Volatility Measures 12 The change in price over n days The maximum price fluctuation during the n days The average true range The sum of the absolute price changes over n days Classic annualized volatility for daily data
- 14. Volatility Measures 14 1. The change in price over n days (Figure 27.5a): 2. The maximum price fluctuation during the n days (Figure 27.3b): where Max and Min are the same as the TradeStation functions Highest and Lowest. 3. The average true range over the past n days: where true range is a function that returns the maximum range from the combination of today's high, low, and the previous close. 4. The sum of the absolute price changes over n days: For stocks, the sum of the returns should be used. 5. Classic annualized volatility for daily data
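The five measures listed above can be computed together from a small price history. A hedged NumPy sketch (the function name and array layout are illustrative; the slide's formulas appear as images and are reconstructed here from their descriptions):

```python
import numpy as np

def volatility_measures(high, low, close, n):
    """The five measures from the slide, over the most recent n days.
    Arrays must contain at least n+1 points (the previous close is needed)."""
    h, l = high[-n:], low[-n:]
    prev_close = close[-n-1:-1]
    net_change = close[-1] - close[-n-1]                  # 1. change over n days
    max_fluct  = h.max() - l.min()                        # 2. maximum fluctuation
    tr         = np.maximum(h, prev_close) - np.minimum(l, prev_close)
    atr        = tr.mean()                                # 3. average true range
    abs_sum    = np.abs(np.diff(close[-n-1:])).sum()      # 4. sum of abs changes
    log_ret    = np.diff(np.log(close[-n-1:]))
    ann_vol    = log_ret.std(ddof=1) * np.sqrt(252)       # 5. annualized volatility
    return net_change, max_fluct, atr, abs_sum, ann_vol
```

For stocks, measure 4 would sum returns rather than point changes, per the slide.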
- 15. Comparing Annualized Volatility and Average True Range ▪ Results from annualized volatility and average true range can be very different and can significantly affect trading decisions, position size, and risk assessment. ▪ The ATR is considerably smoother on a daily basis and shows smaller jumps when prices gap. In some cases, such as at the far right of the chart, the true range is increasing while the annualized volatility is flat. ▪ Both can be converted to dollar values by multiplying by the current price. 15
- 16. Relative Volatility ▪ Relative volatility (RV) can be defined as the volatility over a short period divided by the volatility over a longer period, where the longer period is typical of the normal volatility, ▪ where Vt is any of the volatility measures, and n and m are calculation periods, with f × n = m, where f ≥ 5 (f ≥ 10 would be better). ▪ Lagging the Longer Period ▪ A better measure lags the longer calculation so that it ends before the shorter one starts. ▪ Then the shorter calculation, n, goes from t − n + 1 to t, and the longer one, m, goes from t − m − n + 1 to t − n, non-overlapping periods. ▪ This method also helps on the back side of a volatile period, when the typical calculation includes the recent volatility, making declining volatility seem normal rather than still volatile. 16
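The lagged, non-overlapping version can be sketched directly from the index ranges above (a minimal illustration using the standard deviation of price changes as the volatility measure Vt; any of the measures from the earlier slide could be substituted):

```python
import numpy as np

def relative_volatility(closes, n, m):
    """Volatility over the last n bars divided by volatility over the m bars
    that END where the short window begins, so the windows never overlap."""
    short_vol = np.std(np.diff(closes[-n-1:]), ddof=1)        # t-n+1 .. t
    long_vol  = np.std(np.diff(closes[-m-n-1:-n]), ddof=1)    # t-m-n+1 .. t-n
    return short_vol / long_vol
```

RV well above 1 flags a volatility spike that the lagged base period has not yet absorbed, which is exactly the distortion the slide warns about.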
- 17. Implied Volatility, VIX ▪ The CBOE's volatility index, VIX, reflecting the implied volatility of S&P options, is available on a real-time basis. ▪ VIX was originally the volatility of an options index, OEX, a weighted value of the implied volatilities of 8 puts and calls in the S&P 100, expressed as a percentage of the index price. ▪ If the VIX is 25% and the SPX is 1600, then VIX is forecasting 25% volatility for at-the-money options, relative to the price of the SPX, over a rolling 30-day expiration period. ▪ The 30-day calendar period is equivalent to about 21 trading days, and there are 252 trading days in the year, so 1 standard deviation of the volatility becomes 1600 × 0.25 × √(21/252) = 115.47. ▪ Then, an implied volatility of 25% when the SPX is at 1600 is equal to a 68% (1 standard deviation) chance of a price change of ±115.47 within the next 30 calendar days (21 trading days). 17
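The expected-move arithmetic above (one of the stated learning objectives) fits in one function; a small sketch with illustrative names:

```python
import math

def expected_move(index_price, implied_vol, horizon_days=21, trading_days=252):
    """One-standard-deviation expected move over the horizon:
    price * vol * sqrt(horizon / trading days per year)."""
    return index_price * implied_vol * math.sqrt(horizon_days / trading_days)

# SPX at 1600 with VIX at 25% over 21 trading days -> about +/-115.47
move = expected_move(1600, 0.25)
```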
- 18. Intraday Volatility and Volume ▪ Intraday volatility has a pattern that is identical to volume: highest at the open, declining to its lowest point at mid-session, and rising again as the trading day ends. ▪ To find the correlation between the intraday pattern of volatility and that of volume, a simple linear regression can be solved, where all values are calculated at time t. ▪ The resulting correlation, R = 0.595, is statistically significant for NASDAQ 100 volatility and volume. The correlation between volatility and volume is highest at the beginning and end of the day. 18
- 19. Intraday Volatility and Volume ▪ Meissner and Cercioglu suggest that this volatility pattern, with the corresponding volume that provides liquidity, can be traded by being long options at the beginning and end of the day, profiting from gamma (the rate of change of delta, where delta is the rate of change of the option price with respect to the price of the underlying asset). ▪ During the quiet mid-session period, a short options position may be used to profit from theta, the time decay. 19
- 20. Predicting Volatility with Trading Ranges 20 Thomas Bierovic - On-Balance True Range VIX Trading Systems - Connors MarketSci Blog Gerald Appel on VIX Fractals, Chaos, and Entropy Trends and Price Noise Trends and Interest Rate Carry Neural Networks Modeling Human Behavior Genetic Algorithms Liquidity Trade Selection Using Volatility
- 21. Thomas Bierovic - On-Balance True Range ▪ To visualize the change in volatility, Thomas Bierovic has created an On-Balance True Range by following the same rules as On-Balance Volume (OBV), but substituting the true range calculation for volume. ▪ He then calculates a 9-day exponential smoothing of the On-Balance True Range and uses the crossovers of the oscillator and smoothed oscillator to confirm signals. ▪ Although the highs and lows may come at nearly the same time as those of other oscillators, the relative peaks and valleys may offer the trader new insights. For many traders, this simple interpretation can help separate high and low volatility conditions. 21
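The OBV-style accumulation described above can be sketched as follows. This is a plain-Python illustration of the rule "substitute true range for volume," with made-up function names; crossovers of the raw series against its 9-day smoothing would then confirm signals:

```python
def on_balance_true_range(high, low, close):
    """OBV rules with true range in place of volume: add today's true range
    on an up close, subtract it on a down close, carry forward otherwise."""
    obtr = [0.0]
    for i in range(1, len(close)):
        tr = max(high[i], close[i - 1]) - min(low[i], close[i - 1])
        if close[i] > close[i - 1]:
            obtr.append(obtr[-1] + tr)
        elif close[i] < close[i - 1]:
            obtr.append(obtr[-1] - tr)
        else:
            obtr.append(obtr[-1])
    return obtr

def ema(values, span=9):
    """9-day exponential smoothing used as the signal line."""
    k = 2.0 / (span + 1)
    out = [values[0]]
    for v in values[1:]:
        out.append(out[-1] + k * (v - out[-1]))
    return out
```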
- 22. VIX Trading Systems - Connors ▪ VIX is considered a mean-reverting indicator. ▪ Larry Connors has based a number of trading systems on the VIX, treating volatility as mean-reverting. Entries are based on a minor reversal in the VIX. The rules for buying (selling is the reverse) are: ▪ 1. Today's VIX high must be higher than the VIX high of the past 10 days. ▪ 2. Today's VIX must close below its open. ▪ 3. Yesterday's VIX must have closed above its open. ▪ 4. Today's VIX range must be greater than the ranges of the past 3 days. ▪ 5. If conditions 1–4 are met, then buy S&P futures on the close and exit in 3 days. ▪ Connors is actually looking for turning points in the VIX. 22
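Rules 1 through 4 translate directly into boolean conditions. A minimal sketch (function name and list layout are assumptions, with the most recent bar at index -1):

```python
def connors_vix_buy(vix_open, vix_high, vix_low, vix_close):
    """True when all four of Connors's buy conditions hold on the latest bar.
    Lists must contain at least 11 days of VIX data."""
    ranges = [h - l for h, l in zip(vix_high, vix_low)]
    c1 = vix_high[-1] > max(vix_high[-11:-1])   # above the past 10 days' highs
    c2 = vix_close[-1] < vix_open[-1]           # today closed below its open
    c3 = vix_close[-2] > vix_open[-2]           # yesterday closed above its open
    c4 = ranges[-1] > max(ranges[-4:-1])        # widest range of the past 3 days
    return c1 and c2 and c3 and c4
```

When the function returns True, rule 5 would buy S&P futures on the close and exit in 3 days.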
- 23. VIX Trading Systems - Connors ▪ Another Connors mean-reverting VIX strategy uses the RSI for timing. If ▪ 1. The S&P > its 200-day moving average ▪ 2. The 2-day RSI of the VIX > 90 ▪ 3. Today's VIX open > yesterday's VIX close ▪ then buy on the close and exit when the 2-day RSI closes > 65. ▪ The specific pattern that precedes a buy signal in the S&P ends with a range expansion. ▪ This expansion is likely to mark the end of a short-term upwards move in the VIX. ▪ A decline in the VIX that follows eases the way for a short-term rally in the S&P. 23
- 24. MarketSci Blog ▪ An interesting website is the MarketSci Blog, which offers numerous creative strategies for equities trading. ▪ One strategy computes a 10-day exponential smoothing (EMA) and a 10-day simple moving average (SMA), both applied to the VIX index, buys the VIX when the EMA falls below the SMA, then sells short when the EMA moves above the SMA. ▪ It does this based on the concurrent closing prices. ▪ Fast execution is essential for mean-reverting trades. ▪ Unlike most other strategies, this one is remarkably symmetric, with longs and shorts performing equally. 24
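The EMA-below-SMA rule is simple enough to sketch in plain Python (function names are illustrative; this is only the signal logic, not the blog's backtest):

```python
def sma(values, span=10):
    """Simple moving average at each point (shorter window at the start)."""
    return [sum(values[max(0, i - span + 1):i + 1]) / (i - max(0, i - span + 1) + 1)
            for i in range(len(values))]

def ema(values, span=10):
    """Exponential smoothing with factor 2/(span+1)."""
    k = 2.0 / (span + 1)
    out = [values[0]]
    for v in values[1:]:
        out.append(out[-1] + k * (v - out[-1]))
    return out

def vix_signal(vix_closes, span=10):
    """+1 = long the VIX (EMA below SMA), -1 = short (EMA above SMA)."""
    e, s = ema(vix_closes, span)[-1], sma(vix_closes, span)[-1]
    return 1 if e < s else -1
```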
- 25. Gerald Appel on VIX ▪ Gerald Appel gives his own guidelines for trading the VIX: ▪ Buy when there are high levels of VIX, implying broad pessimism. ▪ There are no reliable sell signals using VIX. ▪ Volatility tends to increase during weaker market climates. ▪ The stock market is likely to advance for as long as volatility remains stable or decreasing. ▪ Volatility System ▪ Bookstaber uses the average true range (ATR) over the past n days as the basis for a simple volatility strategy: ▪ Buy if the next close, Ct+1, rises by more than k × ATRt(n) from the current close Ct ▪ Sell if the next close, Ct+1, falls by more than k × ATRt(n) from the current close Ct ▪ The volatility factor k is given as approximately 3, but can be varied higher or lower to make the trading signals less or more frequent, respectively. This method falls into the category of volatility breakout. 25
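Bookstaber's two rules above map onto a tiny signal function; a minimal sketch with an illustrative name:

```python
def bookstaber_signal(close_t, close_next, atr, k=3.0):
    """Volatility breakout: +1 buy, -1 sell, 0 no action.
    Triggers when the next close moves more than k * ATR from the current close."""
    if close_next > close_t + k * atr:
        return 1
    if close_next < close_t - k * atr:
        return -1
    return 0
```

Raising k produces fewer signals; lowering it produces more, as the slide notes.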
- 26. Trends and Price Noise ▪ The trader who enters a new trend sooner will be more profitable, but the erratic behavior of prices makes a faster response less reliable. What makes a trend so difficult to identify is noise. ▪ Noise is the erratic movement of price and, by definition, it is unpredictable. In engineering, behavior that shows no patterns is called white noise. ▪ When noise is measured using the efficiency ratio, or fractal efficiency, different markets show different levels of noise; the equity index markets have the most noise and short-term rates the least. ▪ Noise is the product of a large number of market participants buying and selling at different times for different purposes. Each has its own objectives and time frame. ▪ Noise can also be the result of price shocks: unexpected events, in particular news or economic reports, that cause changes persisting for varying periods into the future. Noise has most of the qualities of a sequence of random numbers. 26
- 27. Trade Selection Summary ▪ Every system has risk, even so-called riskless trades. Arbitrage, when done properly, has virtually no risk; however, it may be so competitive that the opportunities are rare and the margin of profit small. ▪ In the final analysis, you can't remove the risk, only delay it or move it around. If your system shows that it has essentially no risk, then it's important to rethink your development process to find the flaw. ▪ When selecting trades to eliminate, the easiest place to begin is by associating performance with volatility or price level. ▪ While some systems perform better in an environment of higher volatility, your strategy may show the best return relative to risk when there is less volatility. When you filter trades you will always get rid of good ones while, hopefully, removing more of the bad ones. 27
- 28. Trade Selection Using Volatility 28 High Volatility Eliminate or Delay? Constructing a Volatility Filter Standard Deviation Measurement Entry Filter Results High Volatility Exits: Reducing the Risk Ranking Based on Volatility Trade Selection Summary
- 29. Trade Selection Using Volatility ▪ High volatility is clearly related to greater risk, but low volatility may also mean that there is a smaller chance for profits. ▪ The reasonable expectations for selecting trades based on volatility: ▪ Entering on very high volatility is exposure to very high risk. Returns from high volatility trades may range from large profits to large losses. ▪ Entering on extreme low volatility seems safe, but prices often have no direction and produce small, frequent losses. Waiting for an increase in activity before entering might improve returns. ▪ Exiting a position when prices become very volatile should reduce both profits and risk, but may come too late. 29
- 30. Eliminate or Delay? ▪ At the time of an entry signal, there are two choices. ▪ The trade can be completely eliminated by filtering, ▪ Or it can be delayed until the high volatility drops or the low volatility increases to an acceptable level. ▪ Before starting, we can theorize that short-term trading would most likely eliminate, not delay, trades that fall outside the acceptable volatility range because there are many trades and each is held for a short time. ▪ At the other end of the spectrum are the long-term trend trades, held for weeks, that would suffer if the exceptionally large profit was missed. 30
- 31. Constructing a Volatility Filter ▪ Calculating the volatility is simple to program using any spreadsheet or strategy testing software. The following steps were used here: ▪ 1. Calculate a moving average trend. Use one fast trend and one slow trend. ▪ 2. Calculate the volatility, using any one of the methods described earlier in this chapter, but not including the volatility of the current day. ▪ 3. Enter a new trade (on the close) if today's volatility is (a) above the low filter threshold or (b) below the high filter threshold. ▪ 4. Exit a current position if the volatility is above the high filter threshold and, based on testing, ▪ (a) the current price change has moved in a profitable direction or ▪ (b) the current price change has moved in a losing direction. 31
- 32. Standard Deviation Measurement ▪ A standard deviation was used to determine the volatility threshold level because these levels are associated with probabilities. ▪ A high-volatility filter with a 1-standard-deviation threshold means that no trades were taken if the volatility was above the average volatility plus 1 standard deviation, the top 16% of occurrences. ▪ A 2-standard-deviation threshold filters out volatility in the top 2.5%, and 3 standard deviations restricts only the top 0.13%. ▪ In all cases, a 20-day standard deviation will be used, comparable to VIX. ▪ The program TSM Moving Average will be used as the underlying strategy. ▪ This enters and exits trades based entirely on the direction of the moving average trend line, not on the price penetration of the trend line. 32
- 33. Entry Filter Results 33
- 34. Ranking Based on Volatility ▪ Gerald Appel offers an additional approach to trade selection by creating a ranking method for mutual funds. ▪ Select only funds with average to below-average volatility. ▪ Add the 3-month and 12-month performance together to get a single value. ▪ Rank the funds. ▪ Only invest in the top 10%. 34
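Appel's four steps can be sketched as a small ranking function (the tuple layout and function name are assumptions for illustration):

```python
def rank_funds(funds):
    """funds: list of (name, volatility, perf_3m, perf_12m) tuples.
    Keep funds with average-or-below volatility, rank by the sum of the
    3-month and 12-month performance, and return the top 10% (at least one)."""
    avg_vol = sum(f[1] for f in funds) / len(funds)
    eligible = [f for f in funds if f[1] <= avg_vol]
    ranked = sorted(eligible, key=lambda f: f[2] + f[3], reverse=True)
    top_n = max(1, len(ranked) // 10)
    return [f[0] for f in ranked[:top_n]]
```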
- 35. Liquidity ▪ One reason for disappointing system performance is a lack of understanding of market liquidity at the time of execution. ▪ Consider two systems: ▪ 1. A trend-following method, which will trigger buy or sell orders as prices rise or fall during the day. ▪ 2. A countertrend system, which sells and buys at relative intraday highs or lows. 35
- 36. Liquidity ▪ The endpoints are shown to contribute the largest part of the profits when, in reality, no executions may have been possible near those levels. ▪ Assuming the ability to execute at all points on the actual distribution bb′, the approximate profit contribution is shown as dd′. ▪ For trend-following systems, no profits should be expected when buy or sell orders are placed at the extremes of the day. ▪ The actual price distribution bb′ is the maximum that could be expected from such a system; in reality, the first day is usually a loss. 36
- 37. Liquidity ▪ The dotted line cc′ represents the apparent profit for a countertrend system that makes the assumption of a straight-line volume distribution. 37
- 38. Neural Networks 38 Artificial Neural Networks Selecting and Preprocessing the Inputs The Training Process Success Criteria Reducing the Number of Decision Levels and Neurons Modeling Human Behavior
- 39. Neural Networks ▪ Neural networks are recognized as a powerful tool for uncovering market relationships. ▪ The technique offers exceptional ability for discovering nonlinear relationships between any combination of fundamental information, technical indicators, and price data. ▪ The operation of an artificial neural network can be thought of as a feedback process, similar to the Pavlovian approach to training a dog: ▪ 1. A bell rings. ▪ 2. The dog runs to 1 of 3 bowls. ▪ 3. If right, the dog gets a treat; if wrong, the dog gets a shock. ▪ 4. If trained, stop; if not trained, go back to Step 1. 39
- 40. Neural Networks ▪ Terminology of Neural Networks ▪ Neurons are the cells that compose the brain; they process and store information. ▪ Networks are groups of neurons. ▪ Dendrites are receivers of information, passing it directly to the neurons. ▪ Axons are pathways that come out of the neuron and allow information to pass from one neuron to another. ▪ Synapses exist on the path between neurons and may inhibit or enhance the flow of information between neurons. They can be considered selectors. 40
- 41. Artificial Neural Networks 41
- 42. Artificial Neural Networks ▪ The human brain works in a way very similar to an artificial neural network: it groups and weighs the data, combines them into subgroups, and finally produces a decision. ▪ The human process of weighing the data is complex and not necessarily transparent; that is, we may never know the precise flow of data. ▪ In this example, the weighting factors are found to show that unemployment has a strong negative effect on prices, the GDP a strong positive effect, and inventories a weak positive effect. ▪ The other items had no consistent predictive ability and received a weight of zero. This feedback process is called training. 42
- 44. Selecting and Preprocessing the Inputs 44 ▪ We must decide which factors are most likely affecting the direction of stocks and the ability to anticipate that direction, then prepare data that contains information with those qualities ▪ There are countless factors that might influence the direction of stocks; the more you choose, the slower the solution and the greater the chance of a less robust model. ▪ If you choose too few, they may not contain enough information; therefore, the preprocessing problem requires practice. ▪ You may also construct a number of simple trading systems that show profits and include their basic components as inputs to the neural network. ▪ You might create a performance series for a specific system that has only values –1, 0, and 1, representing short, neutral, and long market positions.
- 45. The Training Process 45 ▪ At the heart of the neural network approach is the feedback process used for training. ▪ This is the part of neural networks that many people refer to as the learning process. ▪ Weighting factors are found using a method called a genetic algorithm. ▪ As the training proceeds, these weighting factors are randomly mutated, or changed, until the best combination is found. ▪ The genetic algorithm changes and combines weighting factors in a manner referred to as survival of the fittest, giving preference to the best and discarding the worst.
- 46. A Training Example 46 ▪ The five most relevant fundamental factors: GNP, unemployment, inventories, the U.S. dollar index, and short-term interest rates. ▪ This test does not use any preprocessed data, such as trends or indicators. To simplify the process, the following approach is taken: ▪ 1. Each input is normalized so that it has values between +100 and −100, indicating strength to weakness, with 0 as neutral. ▪ 2. When the combined value of the five indicators exceeds +125, we will enter a long position; when the combined value is below −125, we will enter a short. ▪ 3. Values between +125 and −125 are considered neutral to the trading strategy. 
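The decision rule in steps 1 through 3 is a simple threshold on the sum of the normalized inputs; a minimal sketch (function name is illustrative):

```python
def market_position(indicators):
    """indicators: the five normalized inputs, each in the range -100..+100.
    Returns +1 (long), -1 (short), or 0 (neutral), per the slide's rules."""
    total = sum(indicators)
    if total > 125:
        return 1
    if total < -125:
        return -1
    return 0
```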
- 47. Success Criteria 47 ▪ Determining success during the learning process is a matter of measuring the ANN output against the training data, looking for convergence. ▪ The common measurements are the average correlation (adjusted for the number of parameters), the t-statistic, the t2-statistic, and the F-statistic. ▪ The t2-statistic is unique to neural networks and measures the nonlinear relationships between two variables.
- 48. Reducing the Number of Decision Levels and Neurons 48 ▪ When there are many decision layers and many neurons, the inputs can be combined and recombined in many different ways, allowing very specific patterns to be found. ▪ The more specific, the greater the chance that the final solution will be overfitted; that is, it will be fine-tuned to such specific patterns in the past that those patterns will not occur in the future. ▪ Neural networks can be highly complex and require experience before they can be used efficiently. ▪ Too many inputs and combinations increase the time of testing and increase the chance of a solution that is overfit. ▪ Too few values can produce a result that is too general, has large risk, and is not practical. It is best to begin with the most general and proceed in clear steps toward a more specific solution.
- 49. Modeling Human Behavior 49 ▪ Neural networks are considered a learning process similar to the parallel architecture of the human brain. ▪ Early in 2003, as well as after the subprime crisis of 2008, corporate earnings and income growth were most important. Other reports were, for the most part, ignored. ▪ After earnings improved and the stock market had rallied, traders looked for employment statistics to improve as a means of sustaining economic growth and the stock market rally. ▪ At that point, earnings were no longer as important as more jobs. ▪ A neural network can be constructed to reflect this selection process, which makes one or two economic reports more important than others given the state of the economy.
- 50. Genetic Algorithms 50 Representation of a Genetic Algorithm Initial Chromosome Pool Fitness Mutation Mating Propagation Converging on a Solution Putting It into Practice: Simulated Performance Multiple Seeding Replication of Hedge Funds
- 51. Genetic Algorithms ▪ The concept of a genetic algorithm is based on Darwin's theory of survival of the fittest. ▪ In the real world, a mutation with traits that improve any creature's ability to survive will continue to procreate. ▪ A genetic algorithm is actually a sophisticated search method that replaces standard optimization; it uses a technique that parallels survival of the fittest. ▪ Standard statistical criteria are used in the selection process to qualify the results. ▪ Searching for a large, optimal set of parameters or finding the best portfolio allocation takes minutes using a genetic algorithm; a standard sequential search may take weeks at the same computing speed. 51
- 52. Representation of a Genetic Algorithm ▪ The most basic component of a genetic algorithm is a gene; a number of genes comprise an individual, and a combination of individuals (and therefore genes) is a chromosome. ▪ A chromosome represents a potential solution, a set of trading rules or parameters, where the genes are the specific values and calculations. These in turn form individuals that represent rules, which ultimately form a trading strategy. 52
- 53. Representation of a Genetic Algorithm ▪ Chromosome 1 might be a rule to buy on strength: ▪ 1. If a 10-day moving average is less than yesterday's close and a 5-day stochastic is greater than 50, then buy. ▪ Chromosome 2 could be a rule that buys on weakness: ▪ 1. If a 20-day exponential is less than yesterday's low and a 10-day RSI is less than 50, then buy. ▪ If we rewrite these two chromosomes in a notational form, the genes and individuals in their structure become more apparent: ▪ 1. Chromosome 1: MA, 10, <, C, [0], &, Stoch, 5, >, 50, 1 ▪ 2. Chromosome 2: Exp, 20, <, L, [1], &, RSI, 10, <, 50, 1 53
- 54. Representation of a Genetic Algorithm ▪ For example, the 10- and 20-day averages in gene 2 of chromosomes 1 and 2 could be changed to 5 and 15 days; or the indicators Stoch and RSI could be changed to MACD and Momentum. ▪ A combination of trading rules, or chromosomes, will create a trading strategy. Before continuing, the following steps will be needed to use the genetic algorithm to find the best results: ▪ 1. A clear way of representing the chromosomes and their component individuals and genes. ▪ 2. A fitness criterion to decide that one chromosome is better than another. ▪ 3. A propagation procedure that determines which chromosomes will survive and in what manner. ▪ 4. A process for mutation (introducing new characteristics) and mating (combining genes) to give chromosomes with greater potential a better chance for survival. 54
- 55. Representation of a Genetic Algorithm Initial Chromosome Pool - Eleven lists are needed, one for each unique gene. 1. Trend type, 1 of 5 choices: a moving average, exponential smoothing, linear regression, breakout, or step-weighted average. 2. Trend calculation period, a number between 1 and 200. 3. Trend relational operator, 1 of 3 choices: <, <=, >. 4. Price used in the trend calculation, 1 of 4 choices: C, (H + L + C)/3, (H + L)/2, indexed value. 5. Reference data or lag: a number between 1 and 10. 6. Method of combining individuals, 1 of 2 choices: and or or. 7. Indicator type, 1 of 5 choices: RSI, stochastic, MACD, momentum, Fisher transform (all indicators must be transformed to return values between −100 and +100). 8. Indicator calculation period, a number between 1 and 50. 9. Relational operator, 1 of 2 choices: > or <. 10. Comparison value for indicator and relational operator, a number between −100 and +100. 11. Market action, 1 of 2 choices: buy or sell. 55
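The eleven gene lists above can be sampled to seed the initial chromosome pool. A minimal sketch; the list names and abbreviations are illustrative encodings of the slide's choices:

```python
import random

# Gene value lists from the slide; each chromosome draws one value from each.
TRENDS     = ["MA", "Exp", "LinReg", "Breakout", "StepWt"]   # 1. trend type
TREND_OPS  = ["<", "<=", ">"]                                # 3. relational operator
PRICES     = ["C", "(H+L+C)/3", "(H+L)/2", "Index"]          # 4. price used
COMBINE    = ["and", "or"]                                   # 6. combining method
INDICATORS = ["RSI", "Stoch", "MACD", "Momentum", "Fisher"]  # 7. indicator type
IND_OPS    = [">", "<"]                                      # 9. relational operator
ACTIONS    = ["buy", "sell"]                                 # 11. market action

def random_chromosome(rng=random):
    """Draw one value for each of the 11 genes to seed the initial pool."""
    return [
        rng.choice(TRENDS), rng.randint(1, 200), rng.choice(TREND_OPS),
        rng.choice(PRICES), rng.randint(1, 10), rng.choice(COMBINE),
        rng.choice(INDICATORS), rng.randint(1, 50), rng.choice(IND_OPS),
        rng.randint(-100, 100), rng.choice(ACTIONS),
    ]
```

An initial pool is then just a list of such chromosomes, each later scored by the fitness criterion.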
- 56. Representation of a Genetic Algorithm ▪ Fitness: defining a fitness criterion, or objective function, that can be used to rank the chromosomes. ▪ A fitness criterion must combine the most important features associated with a successful trading strategy: ▪ Net profits or profits per trade. ▪ The number of trades or a sample-error criterion. ▪ The smoothness of the results or a reward-to-risk ratio. ▪ A fitness score can be built from the following terms: ▪ PPT = the profits per trade ▪ NT = the number of trades ▪ GP = the gross profits ▪ GL = the gross losses 56
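The slide's exact scoring formula is not reproduced here, so the following is only an illustrative fitness function combining the listed terms; the square-root sample weighting and the GP/GL ratio are assumptions, not the book's formula:

```python
import math

def fitness(ppt, nt, gp, gl):
    """Illustrative fitness score (assumed weighting): reward profit
    per trade and the gross-profit-to-gross-loss ratio, damped by
    sample size so results from very few trades are discounted."""
    if nt == 0 or gl == 0:
        return 0.0
    reward_risk = gp / abs(gl)     # smoothness / reward-to-risk proxy
    sample_weight = math.sqrt(nt)  # more trades -> more statistical trust
    return ppt * reward_risk * sample_weight
```

With this weighting, a strategy producing the same per-trade profit over 100 trades scores higher than one tested on only 4 trades.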
- 57. Representation of a Genetic Algorithm ▪ Propagation: the process of natural selection allows only the best individuals to survive. ▪ A strong propagation criterion encourages the survival of the chromosomes with the highest ranking, as determined by the fitness test. ▪ When an individual has a high fitness score, it is allowed to create more offspring and therefore becomes a larger part of the population. ▪ When it has a low score, it creates fewer offspring, or none at all, and eventually disappears from the population. 57
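Fitness-proportional ("roulette-wheel") selection is one standard way to implement this propagation step; the sketch below is an assumption, not the book's exact procedure:

```python
import random

def propagate(population, scores, rng=random):
    """Roulette-wheel selection: each chromosome's chance of producing
    offspring is proportional to its fitness score, so low-scoring
    chromosomes gradually vanish from the pool."""
    total = sum(scores)
    weights = [s / total for s in scores]
    # Draw a new generation of the same size, with replacement.
    return rng.choices(population, weights=weights, k=len(population))
```

A chromosome holding, say, 99% of the total fitness will dominate the next generation, mirroring survival of the fittest.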
- 58. Fractals, Chaos, and Entropy ▪ Chaos Theory ▪ Fractal Dimension ▪ Entropy ▪ Chaotic Patterns and Market Behavior 58
- 59. Fractals, Chaos, and Entropy ▪ Chaos theory is a way to describe the complex behavior of nonlinear systems, those that cannot be described by a straight line. It is also called nonlinear dynamics. ▪ One method of measuring chaotic systems is with various geometric shapes; this effort has resulted in an area of mathematics now called fractal geometry. ▪ In the real world, however, there are no straight lines: if you look closely enough (using a microscope if necessary), all "straight lines" have ragged edges and may be described as chaotic. 59
- 60. Fractal Dimension ▪ Fractal dimension is the degree of roughness or irregularity of a structure or system. ▪ Using Fractal Efficiency ▪ Kaufman's Efficiency Ratio is formed by dividing the absolute value of the net change in price over n periods by the sum of all component moves, taken as positive numbers, over the same n periods. ▪ If the ratio approaches 1.0, the movement is smooth (not chaotic); if the ratio approaches 0, there is great inefficiency, chaos, or noise. ▪ This same measurement has been renamed fractal efficiency. Kaufman related it to trending and nontrending patterns, when the ratio approaches 1.0 and 0, respectively. 60
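The Efficiency Ratio described above can be computed in a few lines; this is a minimal sketch using plain Python lists:

```python
def efficiency_ratio(prices, n):
    """Kaufman's Efficiency Ratio over the last n periods:
    |net change| divided by the sum of absolute bar-to-bar moves.
    Near 1.0 = smooth trend; near 0 = noisy, chaotic movement."""
    window = prices[-(n + 1):]
    net_move = abs(window[-1] - window[0])
    path = sum(abs(b - a) for a, b in zip(window, window[1:]))
    return net_move / path if path else 0.0

# A perfectly smooth trend scores 1.0; a back-and-forth market scores low.
print(efficiency_ratio([1, 2, 3, 4, 5, 6], 5))  # 1.0
print(efficiency_ratio([1, 2, 1, 2, 1, 2], 5))  # 0.2
```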
- 61. Fractal Dimension ▪ Although each market has its own underlying level of noise, the measurement of fractal efficiency should be consistent across all markets; markets may vary in volatility even though their chaotic behavior is technically the same. ▪ Interpreting fractal efficiency as noise allows trading rules to be developed. ▪ For example, a market with less noise should be entered quickly using a trending system, while it is best to wait for a better price if the market has been rated as high noise. ▪ A noisy market is one that continues to change direction, while an efficient market is smooth. When viewed in the long term, the level of market noise should determine the type of strategy applied to each market. 61
- 62. Chaotic Patterns and Market Behavior ▪ Although each market has its own underlying level of noise, the measurement of fractal efficiency should be consistent across all markets; markets may vary in volatility even though their chaotic behavior is technically the same. 62
- 63. Entropy—Predicting by Similar Situations 63
- 64. Main Points to Remember - Higher prices translate into higher volatility. - Linear price scale: equal distance between prices. - Logarithmic price scale: equal percent changes plotted as equal distances. - Volatility is the key driver of risk, and measuring volatility correctly is necessary to control that risk. - The time period over which volatility is measured is also a significant factor in the price-volatility relationship. - Volatility measurement methods: - The change in price over n days - The maximum price fluctuation during the n days - The average true range - The sum of the absolute price changes over n days (relative volatility) - Classic annualized volatility for daily data (annualized volatility)
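Two of the listed measures, the average true range and classic annualized volatility, can be sketched in plain Python (the 252-trading-day convention is the standard annualization factor):

```python
import math
import statistics

def true_range(high, low, prev_close):
    """Largest of: today's range, and the gaps from yesterday's close."""
    return max(high - low, abs(high - prev_close), abs(low - prev_close))

def average_true_range(highs, lows, closes, n):
    """Simple n-period average of the true range."""
    trs = [true_range(h, l, pc)
           for h, l, pc in zip(highs[1:], lows[1:], closes)]
    return sum(trs[-n:]) / n

def annualized_volatility(closes, trading_days=252):
    """Classic annualized volatility for daily data:
    stdev of daily log returns scaled by sqrt(252)."""
    rets = [math.log(b / a) for a, b in zip(closes, closes[1:])]
    return statistics.stdev(rets) * math.sqrt(trading_days)
```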
- 65. Predicting Volatility with Trading Ranges ▪ Thomas Bierovic's On-Balance True Range ▪ VIX Trading Systems: Connors; MarketSci Blog ▪ Gerald Appel on VIX ▪ Fractals, Chaos, and Entropy ▪ Trends and Price Noise ▪ Trends and Interest Rate Carry ▪ Neural Networks ▪ Modeling Human Behavior ▪ Genetic Algorithms ▪ Liquidity ▪ Trade Selection Using Volatility 65
- 66. Main Points to Remember - Intraday volatility and volume (options timing and strategy intraday) - Predicting volatility with trading ranges - On-Balance True Range, following the same rules as On-Balance Volume (OBV) - VIX trading systems: Connors; MarketSci Blog (10 SMA and 10 EMA on the VIX to generate buy and sell signals) - Gerald Appel on VIX (volatility breakout, buy system) - Trends and price noise (volume) - The change in price over n days - The maximum price fluctuation during the n days - The average true range - The sum of the absolute price changes over n days (relative volatility) - Classic annualized volatility for daily data (annualized volatility) - Trade selection using volatility: 1. High volatility 2. Eliminate or delay? (at the time of the entry signal) 3. Constructing a volatility filter 4. Standard deviation measurement 5. High-volatility exits: reducing the risk 6. Ranking based on volatility
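The "constructing a volatility filter" and "standard deviation measurement" points can be sketched as a simple entry filter that eliminates or delays a signal when current volatility is unusually high; the lookback window and the k = 2.0 threshold here are illustrative assumptions:

```python
import statistics

def passes_volatility_filter(vol_history, current_vol, k=2.0):
    """Return False (eliminate or delay the entry) when current
    volatility is more than k standard deviations above its recent
    average.  k = 2.0 is an illustrative threshold, not a rule."""
    mean = statistics.mean(vol_history)
    sd = statistics.stdev(vol_history)
    return current_vol <= mean + k * sd

# Example: recent daily ATR readings as the volatility history.
history = [1.0, 1.1, 0.9, 1.2, 1.0, 1.1]
print(passes_volatility_filter(history, 1.2))  # True  -> take the trade
print(passes_volatility_filter(history, 3.0))  # False -> eliminate/delay
```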
- 67. Main Points to Remember - One main reason for disappointing performance is a lack of understanding of market liquidity at the time of execution. - Fractals, chaos, and entropy: - Chaos theory is a way to describe the complex behavior of nonlinear systems. - Fractal dimension is the degree of roughness or irregularity of a structure or system. - Although each market has its own underlying level of noise, the measurement of fractal efficiency should be consistent across all markets; markets may vary in volatility even though their chaotic behavior is technically the same. - Entropy: predicting by similar situations.
- 68. Main Points to Remember - Neural networks are recognized as a powerful tool for uncovering market relationships. - A genetic algorithm is a sophisticated search method that replaces standard optimization; it uses a technique that parallels the survival of the fittest. - Modeling human behavior means treating the human as a device with a large number of internal mental states, each with its own particular control behavior and interstate transition probabilities.
- 69. Thanks