Telecommunications forecasting

All telecommunications service providers perform forecasting calculations to assist them in planning their networks.[1] Accurate forecasting helps operators make key investment decisions relating to product development and introduction, advertising, pricing, etc., well in advance of product launch, which helps to ensure that the company will make a profit on a new venture and that capital is invested wisely.[2]

Why is forecasting used?

Forecasting can be conducted for many purposes, so it is important that the reason for performing the calculation is clearly defined and understood. Some common reasons for forecasting include:[2]

  • Planning and Budgeting – Using forecast data can help network planners decide how much equipment to purchase and where to place it to ensure optimum management of traffic loads.
  • Evaluation – Forecasting can help management decide if decisions that have been made will be to the advantage or detriment of the company.
  • Verification – As new forecast data becomes available it is necessary to check whether new forecasts confirm the outcomes predicted by the old forecasts.

Knowing the purpose of the forecast will help to answer additional questions such as the following:[2]

  • What is being forecast? – events, trends, variables, technology
  • Level of focus – a single product or a whole product line; a single company or the entire industry
  • How often is forecasting conducted? – daily, weekly, monthly, annually
  • Do the methods used reflect the decisions needed to be taken by management?
  • What are the resources available to make decisions? – lead-time, staff, relevant data, budget, etc.
  • What are the types of errors that could occur and what will they cost the company?

Factors influencing forecasting

When forecasting it is important to understand which factors may influence the calculation, and to what extent. A list of some common factors can be seen below:[2]

  • Technology
    • subscriber access – fibre, wireless, wired, cellular, TDMA, CDMA, handsets
    • application – telephony, PBXs, ISDN, videoconferencing, LANs, teleconferencing, internetworking, WANs
    • technology – broadband, narrowband, carriers, fibre to the curb, DSL
  • Economics
    • Global Economics – economic climate, predictions, estimates, economic factors, interest rates, prime rate, growth, management's outlook, investors’ confidence, politics
    • Sectoral Economics – trends in industry, investors’ outlook, telecommunications, emerging technologies growth rate, recessions, and slowdowns
    • Macroeconomics – inflation, GDP, exports, monetary exchange rates, imports, government deficit, economic health
  • Demographics
    • Measurement of number of people in regions – how many were born, are living and died within a time period
    • The way people live – health, fertility, marriage rates, ageing rate, conception, mortality

Data preparation

Before forecasting is performed, the data being used must be "prepared". If the data contains errors, the forecast built on it will be equally flawed, so it is vital that all anomalous data be removed. This procedure is known as data "scrubbing".[2] Scrubbing involves removing data points known as "outliers" – values that lie outside the normal pattern. Outliers are usually caused by anomalous, often unique events and are therefore unlikely to recur. Removing them improves the integrity of the data and increases the accuracy of the forecast; a simple outlier filter is sketched below.
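
As a minimal sketch (not drawn from the cited sources), the following Python fragment drops values lying more than a chosen number of standard deviations from the mean; the cutoff of 2.5 and the hourly call counts are assumptions made purely for illustration.

```python
# Minimal data-scrubbing illustration: drop values lying more than
# `threshold` standard deviations from the mean. The cutoff and the
# call counts below are assumptions made for this example only.
def scrub(data, threshold=2.5):
    n = len(data)
    mean = sum(data) / n
    std = (sum((x - mean) ** 2 for x in data) / n) ** 0.5
    if std == 0:
        return list(data)
    return [x for x in data if abs(x - mean) / std <= threshold]

# One anomalous spike (e.g. a one-off mass-calling event) among normal hours
hourly_calls = [120, 118, 125, 122, 119, 121, 640, 123, 117]
print(scrub(hourly_calls))  # the 640 outlier is removed
```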

Forecasting methods

There are many different methods used to conduct forecasting. They can be divided into groups according to the underlying approach on which they are based:[2]

Judgment-based methods

Judgment-based methods rely on the opinions and knowledge of people who have considerable experience in the area in which the forecast is being conducted. There are two main judgment-based methods:[2]

  • Delphi method – The Delphi method involves directing a series of questions to experts. The experts provide their estimates regarding future development. The researcher summarizes the replies and sends the summary back to the experts, asking them if they wish to revise their opinions. The Delphi method is not very reliable and has only worked successfully in very rare cases.
  • Extrapolation – Extrapolation is the most commonly used method of forecasting. It is based on the assumption that future events will continue to develop along the same lines as previous events, i.e. that the past is a good predictor of the future. The researcher first acquires data about previous events and plots it, then determines whether a pattern has emerged and, if so, attempts to extend that pattern into the future, thereby generating a forecast of what is likely to happen. To extend patterns, researchers generally use a simple extrapolation rule, such as the S-shaped logistic function, Gompertz curves, or the Catastrophic Curve. It is in deciding which rule to use that the researcher's judgment is required; a minimal sketch of logistic-curve extrapolation follows this list.
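
The Python sketch below illustrates the idea under stated assumptions: the subscriber figures, the saturation level K, and the choice of a logistic curve are invented for this example and are not taken from the cited sources. With K fixed, ln(y / (K − y)) is linear in time, so an ordinary least-squares line yields the curve's parameters, and the fitted curve is then extended into future years.

```python
# Logistic-curve extrapolation sketch (all figures assumed).
import math

years = [2000, 2001, 2002, 2003, 2004, 2005]
subs  = [0.5, 1.1, 2.3, 4.4, 7.0, 9.2]   # subscribers in millions (illustrative)
K = 12.0                                  # assumed saturation level, millions

# Linearise: z = ln(y / (K - y)) = a + b * t, then fit a and b by least squares
t = [y - years[0] for y in years]
z = [math.log(s / (K - s)) for s in subs]
n = len(t)
tbar, zbar = sum(t) / n, sum(z) / n
b = sum((ti - tbar) * (zi - zbar) for ti, zi in zip(t, z)) / \
    sum((ti - tbar) ** 2 for ti in t)
a = zbar - b * tbar

def forecast(year):
    """Extend the fitted logistic curve to a future year."""
    x = a + b * (year - years[0])
    return K / (1.0 + math.exp(-x))

for year in (2006, 2007, 2008):
    print(year, round(forecast(year), 2))
```

Fitting a different rule (a Gompertz curve, say) would proceed in the same spirit but would generally give different extrapolated values, which is exactly where the researcher's judgment enters.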

Survey methods

Survey methods are based on the opinions of customers and are thus reasonably accurate if performed correctly. In performing a survey, the survey’s target group needs to be identified.[3] This can be achieved by considering why the forecast is being conducted in the first place. Once the target group has been identified, a sample must be chosen. The sample is a sub-set of the target and must be chosen so that it accurately reflects everyone in the target group.[3] The survey must then pose a series of questions to the sample group and their answers must be recorded.

The recorded answers must then be analyzed using statistical and analytical methods. Typical analyses include computing the average opinion and the variation about that mean.[3] The results of the analysis should then be checked using alternative forecasting methods, after which the results can be published.[3] It must be kept in mind that this method is only accurate if the sample is a balanced and accurate subset of the target group and if the sample group has accurately answered the questions.[3] A brief sketch of the basic calculation follows.
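
As a hedged illustration, the fragment below computes the average opinion and its spread for a set of assumed survey responses; the question and the response values are invented for the example.

```python
# Survey analysis sketch: the sample mean gives the "average opinion" and
# the standard deviation the variation about it (responses are assumed).
import statistics

# Assumed question: expected monthly spend on a new data service (currency units)
responses = [25, 40, 30, 35, 20, 45, 30, 25, 50, 30]

mean = statistics.mean(responses)
spread = statistics.stdev(responses)   # sample standard deviation
print(f"average opinion: {mean:.1f}, variation: {spread:.1f}")
```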

Time series methods

Time series methods are based on measurements taken of events on a periodic basis.[2] These methods use such data to develop models which can then be used to extrapolate into the future, thereby generating the forecast. Each model operates according to a different set of assumptions and is designed for a different purpose. Examples of Time Series Methods are:[2]

  • Exponential smoothing – This method is based on a weighted moving average of the data being analyzed (e.g. a moving average of sales figures), in which more recent observations carry greater weight.
  • Cyclical and seasonal trends – This method focuses on previous data to help define a pattern or trend that occurs in cyclic or seasonal periods. Researchers can then use current data to adjust the pattern so that it fits this period’s data, and in so doing can forecast what will happen during the remainder of the current season or cycle.
  • Statistical models – Statistical models allow the researcher to develop statistical relationships between variables. These models are based on current data, and by means of extrapolation a future model can be created. Extrapolation techniques are based on standard statistical laws, thus improving the accuracy of the prediction. Statistical techniques not only produce forecasts but also quantify their precision and reliability. Examples of this are the Erlang B and C formulae, developed in 1917 by the Danish mathematician Agner Krarup Erlang; see the sketch after this list.
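
As a hedged illustration of two of the items above, the sketch below applies simple exponential smoothing to an assumed series of monthly traffic minutes and then evaluates the Erlang B blocking probability using its standard recursive form; the smoothing constant, traffic figures, offered load, and circuit count are all assumptions made for the example.

```python
# 1. Simple exponential smoothing: a weighted moving average in which the
#    weight on past observations decays by the factor (1 - alpha).
def exponential_smoothing(series, alpha=0.3):
    """Return the one-step-ahead forecast after smoothing the series."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

monthly_minutes = [100, 104, 99, 110, 108, 115]   # illustrative traffic figures
print(round(exponential_smoothing(monthly_minutes), 1))

# 2. Erlang B: probability that a call is blocked when `offered` erlangs of
#    traffic are offered to `circuits` circuits, computed recursively to
#    avoid large factorials.
def erlang_b(offered, circuits):
    b = 1.0
    for k in range(1, circuits + 1):
        b = offered * b / (k + offered * b)
    return b

print(round(erlang_b(offered=10.0, circuits=15), 4))  # blocking probability
```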

Analogous methods

Analogous methods involve finding similarities between the events being studied and comparable ("analogous") events elsewhere, which are usually selected at a point where they are more "mature" than the current events. No analogous event will perfectly mirror the current events, and this must be kept in mind so that any necessary corrections can be made. By examining the more mature set of events, the future of the current events can be forecast.[2]

Analogous methods can be split up into two groups namely:[2]

  • Qualitative (symbolical) models
  • Quantitative (numeric) models

Causal models

Causal models are the most accurate form of forecasting, but also the most complex. They involve creating a complete model of the events being forecast. The model must include all possible variables and must be able to predict every possible outcome.

Causal models are often so complex that they can only be built with the aid of computers. They are developed using data from a set of events, and the model is only as accurate as the data used to develop it.[2]
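
The toy sketch below is a drastically simplified illustration of the idea, not a full causal model: traffic demand is tied to a single assumed explanatory variable (a GDP index) by ordinary least squares, so a projection of the driver can be translated into a forecast. All figures are invented for the example.

```python
# Toy causal-model sketch (illustrative data): demand as a linear function
# of one explanatory variable, fitted by ordinary least squares.
gdp     = [1.00, 1.03, 1.07, 1.10, 1.15]   # GDP index, assumed
traffic = [200, 208, 219, 225, 238]        # erlangs, assumed

n = len(gdp)
gbar, tbar = sum(gdp) / n, sum(traffic) / n
slope = sum((g - gbar) * (t - tbar) for g, t in zip(gdp, traffic)) / \
        sum((g - gbar) ** 2 for g in gdp)
intercept = tbar - slope * gbar

projected_gdp = 1.20                        # assumed projection for next year
print(round(intercept + slope * projected_gdp, 1))
```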

Combination forecasts

Combination forecasts combine the methods discussed above. The advantage is that in most cases accuracy is increased; however, a researcher must be careful that the disadvantages of each individual method do not combine to produce compound errors in the forecast. Examples of combination forecasts include "Integration of Judgment and Quantitative Forecasts" and "Simple and Weighted Averages".[2] A weighted-average sketch follows.
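
As a minimal sketch of the weighted-average approach, the fragment below blends three forecasts of the same quantity; the individual forecast values and the weights are assumptions chosen purely for illustration.

```python
# Weighted-average combination forecast (values and weights assumed).
forecasts = {"judgment": 1200.0, "time_series": 1350.0, "survey": 1280.0}
weights   = {"judgment": 0.2,    "time_series": 0.5,    "survey": 0.3}

combined = sum(weights[name] * value for name, value in forecasts.items())
print(round(combined, 1))   # 1200*0.2 + 1350*0.5 + 1280*0.3 = 1299.0
```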

Determining forecast accuracy

It is difficult to determine the accuracy of any forecast, since forecasting is by nature an attempt to predict future events. To help test and improve forecast accuracy, researchers use several checking methods. A simple check is to apply several different forecasting methods and compare the results to see whether they are roughly equal. Another is to calculate the errors in the forecasting calculation statistically and express them as the root mean squared error, which gives an indication of the overall error of the method; a small sketch of this check appears below. A sensitivity analysis can also be useful, as it shows what would happen if some of the original data on which the forecast was based turned out to be wrong. Determining forecast accuracy, like forecasting itself, can never be performed with certainty, so it is advisable to ensure that the input data is measured and obtained as accurately as possible, that the most appropriate forecasting methods are selected, and that the forecasting process is conducted as rigorously as possible.[2]
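
As a hedged illustration, the fragment below computes the root mean squared error between a past forecast and what actually happened; both series are assumed figures invented for the example.

```python
# RMSE check on a past forecast (actual and forecast figures are assumed).
import math

actual   = [100, 104, 99, 110, 108, 115]
forecast = [102, 101, 103, 107, 111, 112]

rmse = math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))
print(round(rmse, 2))   # overall error of the method, in the same units as the data
```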

References

  1. Farr, R. E., Telecommunications Traffic, Tariffs and Costs – An Introduction for Managers, Peter Peregrinus, 1988.
  2. Kennedy, I. G., Forecasting, School of Electrical and Information Engineering, University of the Witwatersrand, 2003.
  3. Goodman, A., Surveys and Sampling, 7 November 1999, http://deakin.edu.au/~agoodman/sci101/index.html. Last accessed 30 January 2005.