
2001 Poultry Science Association, Inc.
EGG PRICE FORECASTING
USING NEURAL NETWORKS
H. A. AHMAD, G. V. DOZIER1, and D. A. ROLAND, SR.2
107 Food Animal Production Center,
Tuskegee University, Tuskegee, AL 36088
Phone: (334) 724-4276
Fax: (334) 724-4277
e-mail: [email protected]
Primary Audience: Poultry Market Analysts, Poultry Economists, Extension Personnel,
Researchers, Computer Scientists
SUMMARY
Various models, including linear regression employing different variables of interest, have been
used in the past to predict the future market price of shelled eggs. These models, however, could
not account for most of the variations in market egg price, notwithstanding timely and expensive
data collection. A new approach using neural networks, a branch of artificial intelligence, has been
used in this project to forecast egg price. The results indicated better-fit lines and higher R2. A
general regression neural network proved more accurate than a back propagation neural network.
Neural networks can offer a more efficient alternative to traditional forecasting and prediction
techniques. However, reliable data collection and proper manipulation of such data remain the undergirding of any successful neural network model.
Key words: Egg price, forecasting, neural networks
2001 J. Appl. Poult. Res. 10:162–171
DESCRIPTION OF PROBLEM
Despite being called the most reasonably
priced among all agricultural commodities, egg
price has fluctuated greatly yearly, monthly, and
even daily [1]. Forecasting future egg price is a
complex undertaking that, among other factors,
is mainly driven by the market forces of demand
and supply. Demand and supply, in turn,
have many underlying factors that affect future
egg price. Some of these factors may include,
but are not limited to, the number of egg-laying
hens in operation, egg production, feed cost,
number of hatchable eggs placed for the replacement pullets, molted versus replacement flocks,
climate, seasonality, number of eggs being exported, and shipment schedules. Market forces
are further complicated by such factors as consumer behavior, new research that affects this
behavior (dietary cholesterol intake, Salmonella
enteritidis infection, etc.), and, of course, advertising. It would be extremely difficult to comprehend these factors and collect the required data in
a timely fashion to create any reasonable model.
Even if one does so, such a model will not be able
to account for all the variations in egg pricing.
The agricultural economist makes a forecast based on subjective judgment of some of the above-mentioned factors and then adjusts the outcome for the prevailing market scenario [2, 3]. This approach works sometimes but not always.
1 To whom correspondence should be addressed. Computer Science Department, Dunstan Hall, Auburn University, AL 36849.
2 Poultry Science Department, Auburn University, Auburn, AL 36849-5416.
With the advancement of computer technology and the related software availability, there
are new options that might be more promising
than the traditional approach. One such approach
is artificial neural networks (ANN). ANN is a
computational model that uses a set of processing elements (or nodes) closely related to
neurons in the brain (hence the name, neural
networks). These nodes are interconnected in a
network that can then recognize patterns in data
as it is exposed to the data. In a sense, the network learns from experience just as people do.
This learning distinguishes neural networks
from traditional computing programs that simply
follow instructions in a fixed sequential order.
It is beyond the scope of this paper to comprehensively review these networks; a brief description, however, is given in the background
section. Interested readers are encouraged to consult a textbook [4] on neural
networks or artificial intelligence. Successful
ANN applications have been found in many areas, including medical diagnostics, fingerprinting, stock market prediction, and robotics [5,
6]. Several studies have been conducted to examine the potential for ANN uses in the agricultural
sciences (e.g., fertility detection [7], produce inspection [8], disease detection [9], and amino
acid level prediction [10]). The current research project focused on the use of this technology to predict future egg price and to evaluate its efficiency.
BACKGROUND
A neural network is a computer program (a series of instructions) that behaves somewhat like a biological brain. Millions of neurons in the biological brain work together in parallel, each trying to solve a small part of a complex problem. Judging from the problem-solving methods of humans, this type of problem solving (divide and conquer) seems to be very efficient for recognizing speech and image data, for making decisions based on past experiences, and for associating and applying the acquired knowledge to new situations.
Neural networks learn from examples. To train a neural net under supervision for a specific problem, it needs good examples in which the inputs and outputs are already known. Based on these examples, the net builds a model of the problem.
for the problem. Training data can be obtained
163
FIGURE 1. Schematic view of neural network.
from historical problem data in which the outcomes are already known or by creating sample
problems and solutions with the help of experts.
A typical neural net usually has three layers
of neurons, each of which is connected to the
neurons in the next layer (Figure 1). Each connection has a weight associated with it. Input
values in the first layer are weighted and passed
on to the hidden layer. Neurons in the hidden
layer produce outputs by applying an activation
function to the sum of the weighted input values.
These outputs are then weighted by the connections between the hidden and output layer. The
output layer produces the desired results.
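This forward pass can be sketched in a few lines. The following is a minimal illustration only; the sigmoid activation, the absence of bias terms, and the layer sizes are assumptions made for the example, not details taken from the software used in this study.

    import numpy as np

    def forward(x, w_hidden, w_output):
        """One forward pass through a three-layer network: input -> hidden -> output."""
        # Each hidden neuron applies an activation function to its weighted input sum.
        hidden = 1.0 / (1.0 + np.exp(-(w_hidden @ x)))   # sigmoid activation
        # Hidden outputs are weighted again by the hidden-to-output connections.
        return w_output @ hidden

    # Example: 4 weekly egg quotes in, 3 hidden neurons, 1 predicted quote out.
    rng = np.random.default_rng(0)
    w_hidden = rng.normal(size=(3, 4))
    w_output = rng.normal(size=(1, 3))
    print(forward(np.array([0.85, 0.85, 0.91, 0.98]), w_hidden, w_output))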
The net learns by adjusting its interconnection weights repeatedly so that the output neurons produce results close to the correct outputs
in the training data. Eventually, if the problem
is learned, the weights become stable. The real
power of the trained net lies in producing good
results for data that it has not previously observed.
In this project, the artificial neurons received as inputs the weekly egg quotes of the first 4 wk; the quote of the fifth week corresponded to the output. When the information was loaded into an ANN, it was scaled from its original numeric range to [0,1] or [−1,1].
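Such scaling is typically a simple linear (min-max) rescaling; a short sketch under that assumption:

    import numpy as np

    def scale(values, lo=0.0, hi=1.0):
        """Linearly rescale a series from its own numeric range to [lo, hi]."""
        v = np.asarray(values, dtype=float)
        return lo + (hi - lo) * (v - v.min()) / (v.max() - v.min())

    quotes = [0.85, 0.85, 0.91, 0.98, 0.99]      # weekly egg quotes ($/dozen)
    print(scale(quotes))                          # scaled to [0, 1]
    print(scale(quotes, lo=-1.0, hi=1.0))         # scaled to [-1, 1]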
After variables are imported into the ANN software program and scaled, a calibration set is extracted for use during ANN training. The percentage of the database extracted and the method of pattern extraction may be altered. A pattern is a single row of the database, or a single observation. A rotation method of extraction selects patterns in the order they appear in the database; a random method randomly chooses the calibration patterns.
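The two extraction methods can be sketched as follows. This is an illustration only; reading the rotation method as taking roughly every k-th pattern in database order is an assumption, not the software's documented behavior.

    import random

    def extract_calibration(patterns, fraction=0.2, method="rotation"):
        """Set aside a calibration subset; the remaining patterns are used for training."""
        n_cal = max(1, int(len(patterns) * fraction))
        if method == "rotation":
            # Take roughly every k-th pattern in the order it appears in the database.
            step = max(1, len(patterns) // n_cal)
            cal_idx = set(range(0, len(patterns), step))
        else:
            # Choose the calibration patterns at random.
            cal_idx = set(random.sample(range(len(patterns)), n_cal))
        calibration = [p for i, p in enumerate(patterns) if i in cal_idx]
        training = [p for i, p in enumerate(patterns) if i not in cal_idx]
        return training, calibration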
TABLE 1. Quote of white-shelled, large-sized eggs in the southeast US ($/dozen)A

WEEK OF         1993    1994    1995    1996    1997
1-Jan           0.74    0.77    0.71    0.89    0.85
8-Jan           0.74    0.70    0.70    0.95    0.85
15-Jan          0.74    0.69    0.71    1.05    0.91
22-Jan          0.77    0.70    0.72    1.02    0.98
29-Jan          0.78    0.75    0.71    0.90    0.99
Jan Average     0.75    0.72    0.71    0.96    0.92
5-Feb           0.77    0.77    0.69    0.88    0.98
12-Feb          0.73    0.77    0.70    0.89    0.90
19-Feb          0.70    0.77    0.73    0.93    0.84
26-Feb          0.72    0.77    0.74    0.97    0.84
Feb Average     0.73    0.77    0.72    0.92    0.89
5-Mar           0.78    0.77    0.71    0.98    0.83
12-Mar          0.85    0.78    0.69    0.98    0.86
19-Mar          0.91    0.80    0.70    0.98    0.93
26-Mar          0.95    0.77    0.71    0.98    0.98
Mar Average     0.87    0.78    0.70    0.98    0.90
2-Apr           0.95    0.70    0.71    0.98    0.90
9-Apr           0.89    0.69    0.71    0.93    0.80
16-Apr          0.80    0.67    0.71    0.88    0.77
23-Apr          0.77    0.64    0.68    0.84    0.76
30-Apr          0.73    0.61    0.62    0.78    0.76
Apr Average     0.83    0.66    0.69    0.88    0.80
7-May           0.70    0.61    0.61    0.76    0.76
14-May          0.69    0.68    0.62    0.78    0.76
21-May          0.70    0.71    0.63    0.81    0.76
28-May          0.74    0.69    0.63    0.84    0.75
May Average     0.71    0.67    0.62    0.80    0.76
4-Jun           0.76    0.68    0.63    0.85    0.71
11-Jun          0.76    0.66    0.66    0.84    0.69
18-Jun          0.76    0.64    0.70    0.83    0.71
25-Jun          0.76    0.64    0.72    0.83    0.75
June Average    0.76    0.66    0.68    0.84    0.72
2-Jul           0.76    0.64    0.72    0.83    0.77
9-Jul           0.73    0.67    0.72    0.82    0.81
16-Jul          0.71    0.73    0.81    0.85    0.88
23-Jul          0.75    0.75    0.90    0.88    0.95
30-Jul          0.79    0.75    0.87    0.88    0.91
July Average    0.75    0.71    0.80    0.85    0.86
6-Aug           0.81    0.73    0.77    0.88    0.80
13-Aug          0.81    0.71    0.75    0.91    0.75
20-Aug          0.81    0.71    0.75    0.94    0.74
27-Aug          0.77    0.72    0.75    0.94    0.78
Aug Average     0.80    0.72    0.76    0.92    0.77
3-Sep           0.71    0.72    0.77    0.94    0.85
10-Sep          0.68    0.72    0.82    0.94    0.89
17-Sep          0.68    0.72    0.85    0.94    0.90
24-Sep          0.72    0.67    0.84    0.94    0.88
Sep Average     0.70    0.71    0.82    0.94    0.88
1-Oct           0.75    0.64    0.80    0.92    0.81
8-Oct           0.75    0.64    0.78    0.89    0.78
15-Oct          0.75    0.67    0.82    0.89    0.78
22-Oct          0.74    0.72    0.86    0.89    0.80
29-Oct          0.74    0.74    0.86    0.92    0.87
Oct Average     0.75    0.68    0.82    0.90    0.81
5-Nov           0.74    0.74    0.90    0.97    0.96
12-Nov          0.76    0.74    0.98    1.03    1.04
19-Nov          0.77    0.74    1.01    1.11    1.05
26-Nov          0.75    0.74    1.01    1.16    1.05
Nov Average     0.76    0.74    0.98    1.07    1.03
3-Dec           0.70    0.74    1.01    1.15    1.01
10-Dec          0.71    0.75    1.01    1.14    0.98
17-Dec          0.76    0.75    0.95    1.01    0.96
24-Dec          0.82    0.75    0.89    0.90    0.91
Dec Average     0.75    0.75    0.97    1.05    0.97

A Egg prices 1998, Urner Barry Publications, Inc.
When a network with back propagation architecture is presented with a training set, each neuron transforms the weighted sum of its inputs into an output that is transferred to other neurons. The difference between the predicted output and the actual training output is computed. The error is propagated backward through the hidden layer to the input layer. The connection weights between neurons and layers are adjusted until the output error is minimized. All of the training set data are presented repeatedly until the network is able to reproduce the training set successfully. This trained ANN can then be used to predict outputs when given inputs upon which it has not been trained.
The learning rate is essential in back propagation network training. Each time an input is presented to the network, the weights leading to the output are modified to produce a smaller error between the network's predicted output and the actual output values present in the calibration set. The amount of connection modification is determined by the learning rate multiplied by the error. For example, if the learning rate is 0.75, the weight change will be ³⁄₄ of the error. Higher learning rates result in larger weight changes and faster training.
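A single simplified back propagation update, with the learning rate scaling each weight correction, might look like the sketch below. One hidden layer, sigmoid hidden units, a linear output, and no momentum term are assumptions made for brevity; this is not the implementation used in the commercial software.

    import numpy as np

    def backprop_step(x, target, w_hidden, w_output, learning_rate=0.75):
        """One simplified weight update; every correction is scaled by the learning rate."""
        hidden = 1.0 / (1.0 + np.exp(-(w_hidden @ x)))   # forward pass, sigmoid hidden layer
        prediction = w_output @ hidden                   # linear output layer
        error = target - prediction                      # output error
        # Error signal pushed back through the hidden layer (sigmoid derivative).
        hidden_error = (w_output.T @ error) * hidden * (1.0 - hidden)
        # Each connection changes by learning_rate times its share of the error.
        w_output += learning_rate * np.outer(error, hidden)
        w_hidden += learning_rate * np.outer(hidden_error, x)
        return float(error[0])

    # Example: four scaled weekly quotes as inputs, the fifth week's quote as the target.
    rng = np.random.default_rng(0)
    w_hidden, w_output = rng.normal(size=(3, 4)), rng.normal(size=(1, 3))
    print(backprop_step(np.array([0.85, 0.85, 0.91, 0.98]), np.array([0.99]),
                        w_hidden, w_output))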
A general regression neural network (GRNN) functions by measuring how far a new pattern is from the patterns of the training and calibration sets in N-dimensional space, where N is the number of inputs in the problem. When a new pattern is presented to the network, it is compared in N-dimensional space to all of the patterns in the training set to determine how far it is from those patterns. The output predicted by the network is a proportion of all of the outputs in the training set, with the proportion based on how far the new pattern is from the given patterns in the training set.
The success of a GRNN depends on a smoothing factor instead of a learning rate and momentum. The smoothing factor must be greater than 0 and usually gives good results in the range from 0.01 to 1. A default smoothing factor is calculated when calibration is used in training. Higher smoothing factors produce a more relaxed surface fit through the data.
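The distance-weighted prediction can be sketched as below. The Gaussian weighting kernel is an assumption about the form of the distance weighting, not a detail taken from the software.

    import numpy as np

    def grnn_predict(x_new, x_train, y_train, smoothing=0.03):
        """Predict as a distance-weighted proportion of all training outputs."""
        # Squared Euclidean distance from the new pattern to every training pattern
        # in N-dimensional space (N = number of inputs).
        d2 = np.sum((x_train - x_new) ** 2, axis=1)
        # Nearer patterns receive larger weights; the smoothing factor controls
        # how quickly influence falls off with distance.
        w = np.exp(-d2 / (2.0 * smoothing ** 2))
        return float(np.sum(w * y_train) / np.sum(w))

    # Four-week quote windows as inputs, the fifth-week quote as output (Table 4 style).
    x_train = np.array([[0.85, 0.85, 0.91, 0.98], [0.85, 0.91, 0.98, 0.99]])
    y_train = np.array([0.99, 0.98])
    print(grnn_predict(np.array([0.91, 0.98, 0.99, 0.98]), x_train, y_train))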
As with a regression analysis equation, either type of ANN architecture can be overfit. Just as a number of variables can be made to fit a nonlinear line exactly, so too can the ANN be overtrained to fit the training data perfectly. The problem is that an overtrained ANN (or regression analysis equation) loses the ability to generalize and predict on data upon which it has not been trained. If the ANN overtrains, it starts to memorize the training set and becomes poorer at predicting the calibration set. As training proceeds, the ANN first becomes better and, eventually, poorer at predicting the output values of the calibration set. The mean squared error between the actual and predicted output values is calculated by calibration, and the ANN is saved at the point the calibration set is optimally predicted.
TABLE 2. Historical USDA data: egg price, number of hens, egg storage, and number of chickens hatched

DATE        EGG PRICE   NUMBER OF HENS   EGG STORAGE        CHICKENS HATCHED
(month-yr)  ($/dozen)   (million)        (million pounds)   (million)
Jan-93      0.75        236.7            17.16              34.852
Feb-93      0.73        236.9            16.74              33.984
Mar-93      0.87        236.1            16.95              38.232
Apr-93      0.83        235.9            15.06              37.143
May-93      0.71        234.6            14.32              36.741
Jun-93      0.76        234.3            15.49              35.587
Jul-93      0.75        235              15.09              33.98
Aug-93      0.80        236.6            17.6               31.455
Sep-93      0.70        236.9            18.14              31.775
Oct-93      0.75        239.2            14.37              31.634
Nov-93      0.76        239.7            14.04              30.073
Dec-93      0.75        240.5            13.53              30.446
Jan-94      0.72        241.3            13.72              33.236
Feb-94      0.77        240.1            14.76              31.086
Mar-94      0.78        240.5            15.84              33.489
Apr-94      0.66        240.7            15.63              35.657
May-94      0.67        238.5            16.35              35.322
Jun-94      0.66        237.8            15.2               31.985
Jul-94      0.71        236.9            15.42              29.613
Aug-94      0.72        237.3            18.97              31.295
Sep-94      0.71        241.2            19.74              31.587
Oct-94      0.68        243.8            17.81              32.066
Nov-94      0.74        245.2            20.05              26.075
Dec-94      0.75        247.5            19.1               30.166
Jan-95      0.71        248.3            19.49              32.375
Feb-95      0.72        245.8            19.51              32.745
Mar-95      0.70        245.6            18.27              36.021
Apr-95      0.69        244              18.46              35.02
May-95      0.62        242.1            17.33              37.482
Jun-95      0.68        238.9            18.14              34.948
Jul-95      0.80        237.4            22.88              29.554
Aug-95      0.76        235.3            20.55              31.434
Sep-95      0.82        237.7            18.02              33.578
Oct-95      0.82        238.8            16.2               33.384
Nov-95      0.98        242.4            14.37              29.13
Dec-95      0.97        245.3            12.48              30.797
Jan-96      0.96        246.7            13.8               31.523
Feb-96      0.92        245.7            15.6               34.627
Mar-96      0.98        245.4            16.19              37.474
Apr-96      0.88        245.5            12.38              35.628
May-96      0.80        242.6            11.53              38.607
Jun-96      0.84        241.9            11.41              34.076
Jul-96      0.85        241.8            11.74              33.331
Aug-96      0.92        244.2            13.48              32.393
Sep-96      0.94        245.3            15.04              32.07
Oct-96      0.90        247.7            14.94              33.065
Nov-96      1.07        249.3            12.7               31.437
Dec-96      1.05        250.7            10.35              33.017
Jan-97      0.92        250.5            10.19              33.331
Feb-97      0.89        248.5            11.04              35.318
Mar-97      0.90        249.5            11.47              37.648
Apr-97      0.80        248              8.55               38.746
May-97      0.76        246.7            8.49               38.391
Jun-97      0.72        244.9            8.37               36.955
Jul-97      0.86        243.3            8.59               33.954
Aug-97      0.77        242.9            9.16               32.903
Sep-97      0.88        245.1            11.1               35.794
Oct-97      0.81        249.7            10.9               35.175
Nov-97      1.03        251.1            10.92              27.803
Dec-97      0.97        255.6            10.2
FIGURE 2. Back propagation neural network (BPNN)—predicted output is independent of the historical data and is produced by the trained neural network.
MATERIALS AND METHODS
DATA
Historic data of large-sized, white-shelled eggs were used for the neural network training in this research project.
TABLE 3. Summary statistics of a back propagation neural network

Patterns processed              48
R2                              0.2682
r2                              0.6099
Mean squared error              0.007
Mean absolute error             0.062
Minimum absolute error          0.002
Maximum absolute error          0.222
Correlation coefficient r       0.7809
Percentage within 5%            45.833
Percentage within 5 to 10%      31.25
Percentage within 10 to 20%     16.667
Percentage within 20 to 30%     6.25
Percentage over 30%             0
TABLE 4. Test examples of egg quote data from 1997

TRAINING EXPERIMENT   INPUT 1   INPUT 2   INPUT 3   INPUT 4   OUTPUT
1/7/1997              0.85      0.85      0.91      0.98      0.99
1/14/1997             0.85      0.91      0.98      0.99      0.98
1/21/1997             0.91      0.98      0.99      0.98      0.9
1/28/1997             0.98      0.99      0.98      0.9       0.84
2/4/1997              0.99      0.98      0.9       0.84      0.84
2/11/1997             0.98      0.9       0.84      0.84      0.83
2/18/1997             0.9       0.84      0.84      0.83      0.86
2/25/1997             0.84      0.84      0.83      0.86      0.93
3/4/1997              0.84      0.83      0.86      0.93      0.98
3/11/1997             0.83      0.86      0.93      0.98      0.9
3/18/1997             0.86      0.93      0.98      0.9       0.8
3/25/1997             0.93      0.98      0.9       0.8       0.77
4/1/1997              0.98      0.9       0.8       0.77      0.76
4/8/1997              0.9       0.8       0.77      0.76      0.76
4/15/1997             0.77      0.76      0.76      0.76      0.76
4/22/1997             0.76      0.76      0.76      0.76      0.76
4/29/1997             0.76      0.76      0.76      0.76      0.75
5/6/1997              0.76      0.76      0.76      0.75      0.71
5/13/1997             0.76      0.76      0.75      0.71      0.69
5/20/1997             0.76      0.75      0.71      0.69      0.71
5/27/1997             0.75      0.71      0.69      0.71      0.75
6/3/1997              0.75      0.71      0.69      0.71      0.75
6/10/1997             0.71      0.69      0.71      0.75      0.77
6/17/1997             0.69      0.71      0.75      0.77      0.81
6/24/1997             0.71      0.75      0.77      0.81      0.88
7/1/1997              0.75      0.77      0.81      0.88      0.95
7/8/1997              0.77      0.81      0.88      0.95      0.91
7/15/1997             0.81      0.88      0.95      0.91      0.8
7/22/1997             0.88      0.95      0.91      0.8       0.75
7/29/1997             0.95      0.91      0.8       0.75      0.74
8/5/1997              0.91      0.8       0.75      0.74      0.78
8/12/1997             0.8       0.75      0.74      0.78      0.85
8/19/1997             0.75      0.74      0.78      0.85      0.89
8/26/1997             0.74      0.78      0.85      0.89      0.9
9/2/1997              0.78      0.85      0.89      0.9       0.88
9/9/1997              0.85      0.89      0.9       0.88      0.81
9/16/1997             0.89      0.9       0.88      0.81      0.78
9/23/1997             0.9       0.88      0.81      0.78      0.78
9/30/1997             0.88      0.81      0.78      0.78      0.8
10/7/1997             0.81      0.78      0.78      0.8       0.87
10/14/1997            0.78      0.78      0.8       0.87      0.96
10/21/1997            0.78      0.8       0.87      0.96      1.04
10/28/1997            0.8       0.87      0.96      1.04      1.05
11/4/1997             0.87      0.96      1.04      1.05      1.05
11/11/1997            0.96      1.04      1.05      1.05      1.01
11/18/1997            1.04      1.05      1.05      1.01      0.98
11/25/1997            1.05      1.05      1.01      0.98      0.96
12/2/1997             1.05      1.01      0.98      0.96      0.91
From 1993 to 1997, Urner Barry [11] quotes for the southeast US were collected from Urner Barry's 1998 reports. The data for number of hens, eggs placed for hatching, and egg storage capacity were collected from the poultry yearbook in the USDA database [12]. The daily egg quotes were averaged into weekly and monthly rates and are given in Table 1; the USDA data are given in Table 2.
NEURAL NETWORK
To develop a viable and logical neural network model, various approaches were tried, but only one is reported here. A neural network model can be developed by employing different architectures and by manipulating data sets within such architectures. Initially the three variables number of hens, egg storage capacity, and chickens hatched (independent variables) were used with egg quote (dependent variable) as three inputs and one output for a back propagation neural network (BPNN). The results, however, were not satisfactory in terms of forecasting efficiency and are not reported here.
Alternatively, the egg quote data were used as the sole variable. To recognize all possible patterns within the egg quote data set, we manipulated this data set to create its own inputs and output rather than bringing along the other variables. Starting from the first week of January 1993, the first 4 wk of data were taken as inputs and the fifth week of data as the output, which constituted one training example. The second training example consisted of the second, third, fourth, and fifth weeks as inputs, with the sixth week as the output. This process was iterated until all of the weeks from January 1993 to December 1996 were included as inputs and outputs in the training examples, for a total of 312 examples. The networks, built with NeuroShell 2® [5], were trained on these sets until no further improvements were noticed in the network; at that point the training was stopped.
The egg quote data from 1997 were used as a test case to observe the validity of the trained network. The 1997 data were organized in a fashion similar to the training data, for a total of 48 test examples. Two different architectures, namely BPNN and GRNN, were applied to the test data, and the results were compared for best-fit lines and statistical analysis.
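The construction of training and test examples described above amounts to sliding a 5-wk window over the weekly quote series; a minimal sketch follows (the short quote list shown is only the start of the January 1993 series from Table 1):

    def make_examples(weekly_quotes, window=4):
        """Each example: quotes of 4 consecutive weeks as inputs, the 5th week as output."""
        examples = []
        for i in range(len(weekly_quotes) - window):
            inputs = weekly_quotes[i:i + window]     # weeks i .. i+3
            output = weekly_quotes[i + window]       # week i+4, the value to predict
            examples.append((inputs, output))
        return examples

    # First weeks of 1993 from Table 1; the full training series runs to December 1996.
    quotes_1993 = [0.74, 0.74, 0.74, 0.77, 0.78, 0.77, 0.73, 0.70]
    print(make_examples(quotes_1993)[0])   # ([0.74, 0.74, 0.74, 0.77], 0.78)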
RESULTS AND DISCUSSION
The BPNN results are presented in Figure 2 along with the statistical summary in Table 3. These results were generated using the 48 patterns (one epoch) of the test examples given in Table 4. The BPNN produced outputs (predictions) based upon the neural network that was previously trained on the historical data given in Table 2. The results of training the BPNN are not shown.
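The summary statistics reported in Tables 3 and 5 can be computed from paired actual and predicted quotes along the following lines; treating R2 as the coefficient of determination and r2 as the squared Pearson correlation is an assumption about the software's definitions:

    import numpy as np

    def summary_stats(actual, predicted):
        """Goodness-of-fit measures for a set of actual versus predicted egg quotes."""
        actual = np.asarray(actual, dtype=float)
        predicted = np.asarray(predicted, dtype=float)
        err = actual - predicted
        r = np.corrcoef(actual, predicted)[0, 1]          # Pearson correlation
        rel = np.abs(err) / actual * 100.0                # absolute error as % of actual
        return {
            "patterns processed": len(actual),
            "R2": float(1.0 - np.sum(err ** 2) / np.sum((actual - actual.mean()) ** 2)),
            "r2": float(r ** 2),
            "mean squared error": float(np.mean(err ** 2)),
            "mean absolute error": float(np.mean(np.abs(err))),
            "correlation coefficient r": float(r),
            "percentage within 5%": float(np.mean(rel <= 5) * 100.0),
            "percentage within 5 to 10%": float(np.mean((rel > 5) & (rel <= 10)) * 100.0),
        }

    print(summary_stats([0.99, 0.98, 0.90], [0.95, 0.97, 0.93]))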
FIGURE 3. General regression neural network (GRNN)—predicted output is independent of the historical data
and is produced by the trained neural network.
TABLE 5. Summary statistics of general regression neural network

Patterns processed              48
Smoothing factor                0.033294
R2                              0.7312
r2                              0.756
Mean squared error              0.003
Mean absolute error             0.038
Minimum absolute error          0.001
Maximum absolute error          0.15
Correlation coefficient r       0.8695
Percentage within 5%            64.583
Percentage within 5 to 10%      27.083
Percentage within 10 to 20%     8.333
Percentage within 20 to 30%     0
Percentage over 30%             0
Although the BPNN R2 of 0.268 is relatively low, it should be kept in mind that this R2 was generated from egg price data only, without the trouble of collecting additional data. In that sense the BPNN was more efficient, as it not only predicted a better-fit line but also avoided the additional effort, in terms of time and resources, of collecting relevant supporting data for egg price forecasting.
Figure 3 shows the GRNN results, along with summary statistics in Table 5. Compared to the BPNN, the GRNN showed more promising results, with a mean squared error of 0.003 versus 0.007 for the BPNN. The correlation coefficient was 0.87 compared to 0.78 for the BPNN. Moreover, R2 was 0.73 compared to 0.27 for the BPNN. The GRNN-predicted response was in close proximity to the actual response, indicating better generalization of the training data by the GRNN than by the BPNN.
Although the software has a built-in validation procedure, we used a separate data set for our own validation of the trained neural networks. Once the final neural network model was generated and the predicted numerical results were produced, we compared these results with the actual data set for which the predictions were intended, that is, the 1998 weekly egg quotes. Weekly egg quote data from 1993 to 1996 provided the data sets for neural network training, and the 1997 egg quotes were used for validation. The results were produced with the 1997 data; these predicted results were then compared with the 1998 data set and were found to be satisfactory. This study is exploratory; although we believe there is great potential for such an approach to forecast future egg price accurately, further research is needed not only on the methodology but on market validation as well.
CONCLUSIONS AND APPLICATIONS
1. Linear regression models, even with timely collected, related data, may not account for all variations in egg price.
2. The back propagation neural network recognized the patterns in the data more efficiently and produced a better-fit line for the predicted egg price.
3. The general regression neural network provided more accurate predictions than the back propagation neural network.
4. Reliable data collection and proper manipulation of such data remain the prerequisite for any successful neural network model.
5. These results have the potential to be successfully implemented in the poultry industry for future price forecasting and prediction if proper training in data manipulation and in the software is provided to the personnel concerned.
REFERENCES AND NOTES
1. Oguri, K., H. Adachi, C. Yi, and Y. Cho, 1992. Study on egg price forecasting in Japan. Res. Bull. Fac. Agric. Gifu Univ. 57:157–164.
2. Bell, D. D., Cooperative Extension, University of California, Highlander Hall, Riverside, CA 92521. Personal communication.
3. Schrader, L. F., Agricultural Economics, Math Building, Purdue University, West Lafayette, IN 47907. [email protected]. Personal communication.
4. Fausett, L., 1994. Fundamentals of Neural Networks: Architectures, Algorithms, and Applications. Prentice-Hall, Inc., Upper Saddle River, NJ 07458.
5. Ward Systems Group, 1993. NeuroShell 2® Users Manual. Ward Systems Group, Inc., Frederick, MD. http://www.wardsystems.com.
6. Baxt, W.A.G., 1995. Application of artificial neural networks to clinical medicine. Lancet 346:1134–1138.
7. Das, K., and M.D. Evans, 1992. Detecting fertility of hatching eggs using machine vision II: Neural network classifiers. Trans. ASAE 35:2035–2041.
8. Deck, S., C.T. Morrow, P.H. Heinemann, and H.J. Summer, III, 1992. Neural networks for automated inspection of produce. American Society of Agricultural Engineers Paper No. 923594. American Society of Agricultural Engineers, St. Joseph, MI.
9. Roush, W.B., Y.K. Kirby, T.L. Cravener, and R.F. Wideman, Jr., 1996. Artificial neural network prediction of ascites. Poult. Sci. 75:1479–1487.
10. Roush, W.B., and T.L. Cravener, 1997. Artificial neural network prediction of amino acid level in feed ingredients. Poult. Sci. 76:721–727.
11. Egg prices 1998. Urner Barry Publications, Inc., Toms River, New Jersey. http://www.urnerbarry.com. Accessed April 27, 2001.
12. Poultry Yearbook, USDA, Washington, DC. http://www.ers.USDA.gov/briefing/poultry. Accessed April 27, 2001.