
\def\bold#1{\setbox0=\hbox{$#1$}%
     \kern-.025em\copy0\kern-\wd0
     \kern.05em\copy0\kern-\wd0
     \kern-.025em\raise.0433em\box0 }
\def\slash#1{\setbox0=\hbox{$#1$}#1\hskip-\wd0\dimen0=5pt\advance
       \dimen0 by-\ht0\advance\dimen0 by\dp0\lower0.5\dimen0\hbox
         to\wd0{\hss\sl/\/\hss}}
\documentstyle[12pt]{article}
\renewcommand{\topfraction}{1.0}
\renewcommand{\bottomfraction}{1.0}
\renewcommand{\textfraction}{0.0}
\newlength{\dinwidth}
\newlength{\dinmargin}
\setlength{\dinwidth}{21.0cm}
\textheight25cm \textwidth16.0cm
\setlength{\dinmargin}{\dinwidth}
\addtolength{\dinmargin}{-\textwidth}
\setlength{\dinmargin}{0.5\dinmargin}
\oddsidemargin -1.0in
\addtolength{\oddsidemargin}{\dinmargin}
\setlength{\evensidemargin}{\oddsidemargin}
\setlength{\marginparwidth}{0.9\dinmargin}
\marginparsep 8pt \marginparpush 5pt
\topmargin -42pt
\headheight 12pt
\headsep 30pt \footheight 12pt \footskip 24pt
\parskip 3mm plus 2mm minus 2mm
%\pagestyle{empty}
\voffset=-1.0truecm
%\renewcommand{\baselinestretch}{1.5}
\newcommand{\resection}[1]{\setcounter{equation}{0}\section{#1}}
\newcommand{\appsectio}{\setcounter{section}{0}
         \addtocounter{section}{1} \setcounter{equation}{0}
                         \section*{Appendix \Alph{section}}}
\newcommand{\appsection}{\addtocounter{section}{1} \setcounter{equation}{0}
                         \section*{Appendix \Alph{section}}}
\renewcommand{\theequation}{\thesection.\arabic{equation}}
%\setcounter{page}{0}
\begin{document}
\def\lq{\left [}
\def\rq{\right ]}
\def\LL{{\cal L}}
\def\VV{{\cal V}}
\def\AA{{\cal A}}

\newcommand{\be}{\begin{equation}}
\newcommand{\ee}{\end{equation}}
\newcommand{\bea}{\begin{eqnarray}}
\newcommand{\eea}{\end{eqnarray}}
\newcommand{\nn}{\nonumber}
\newcommand{\dd}{\displaystyle}

\thispagestyle{empty}
\vspace*{4cm}
\begin{center}
  \begin{Large}
  \begin{bf} HIGGS SEARCH BY NEURAL NETWORKS AT LHC\\
  \end{bf}
  \end{Large}
  \vspace{1cm}
  \begin{large}
P. Chiappetta\\
  \end{large}
{\it Centre de Physique Th\'eorique, C.N.R.S. Luminy, France}\\
  \vspace{8mm}
  \begin{large}
P. Colangelo$^1$, P. De Felice$^{1,2}$, G. Nardulli$^{1,2}$\\
  \end{large}
$^1${\it I.N.F.N., Sezione di Bari}\\
$^2${\it Dipartimento di Fisica, Universit\`a
di Bari, Italy}\\
  \vspace{8mm}
  \begin{large}
G. Pasquariello\\
  \end{large}
{\it Istituto Elaborazione Segnale Immagini, C.N.R., Bari, Italy}
  \vspace{5mm}
\end{center}
  \vspace{2cm}
\begin{center}
CPT-93/PE 2969 \\
BARI-TH/159-93 \\
November 1993\\
\end{center}
\vspace{1cm}
\begin{quotation}
\begin{center}
  \begin{Large}
  \begin{bf}
  ABSTRACT
  \end{bf}
  \end{Large}
\end{center}
  \vspace{5mm}
\noindent
We show that neural network classifiers can be used to discriminate Higgs
production from background at LHC for $ 150< M_H<200$ GeV. The results
compare favourably with ordinary multivariate analysis.
\end{quotation}
\newpage
\setcounter{page}{1}
\resection{Introduction}
\par
Experimental data accumulated so far, and especially the LEP results,
strongly support the Standard Model (hereafter denoted as SM) as the
theory of the fundamental interactions at the presently
available energies. Nevertheless the verification
of its validity has still to be completed, since the top quark and the Higgs
boson have not yet been discovered.

As is well known, the value of the
Higgs mass is not predicted by the theory, but there are indications, arising
from the limits of applicability of perturbation theory and from unitarity
violation, that it should not exceed
800 GeV \cite{lus}. If Higgs particles below $\approx 1$ TeV are not
discovered, other strong forces could be at work, as predicted by the
Technicolor \cite{tec} scheme, which however, at least in its minimal version,
is not
favored by LEP data \cite{alt}; in this and
other similar approaches the strongly interacting
scalar sector might be revealed by the presence of new vector
bosons \cite{cas} and some light on the electroweak symmetry breaking
could be shed by longitudinal boson scattering. Another theoretical extension
of the SM is provided by Supersymmetry (SUSY) (for reviews see \cite{sus}),
which, as is
well known, naturally solves the {\it hierarchy problem}:
boson and fermion loop contributions to scalar masses have
opposite signs in SUSY and tend to cancel out, thus avoiding
Higgs masses of the order of the Grand Unification scale $M_{GUT}\sim 10^{16}$
GeV. In the sequel, however, we shall consider only the SM Higgs, both
because its search is the prime motivation for the future
high energy hadron colliders and because one of the SUSY Higgs particles
exhibits a similar behaviour.
\par
The present lower limit on the SM Higgs mass coming from LEP is $63$ GeV
\cite{lep}. Therefore, since
the LEP 200 discovery limit is around $80-90$ GeV, more energetic colliders
are mandatory to pin down the mechanism of the electroweak symmetry
breaking.
\par
In this letter we will consider experiments aiming to discover the SM Higgs at
the future Large Hadron Collider (LHC) which is planned for the end of this
century at CERN.
Within the Standard Model
the discovery of the Higgs particle should
be complicated
by the presence of huge backgrounds. Considerable effort has been
provided by the Aachen workshop of the LHC study groups to clarify this issue,
 and we refer
the interested reader to the Proceedings \cite{lhc} of that Conference for
a comprehensive survey. Our aim here is to analyze
the possibility of using a
neural network (NN) classifier as a tool for a better discrimination
between signal and background, and to compare the performance
of the neural trigger with that of traditional statistical methods such as
multivariate analysis. Given the limits of the present
work we shall not consider the whole Higgs mass range, nor shall we
study all the possible Higgs decay channels, but we shall limit ourselves
to some specific case studies. More precisely we shall analyze
the Higgs mass range $150 - 200$  GeV
and study the  decay into four muons,
which, as shown by the above mentioned LHC  study groups, seems to be the
most favourable decay channel for Higgs discovery.

We first discuss in section 2 a possible choice of the
physical observables useful for
the separation of the Higgs signal from background;
these observables are the input variables
for the NN classifier that is described in the same section.
We present our results and discuss the relative performance
of the NN method and multivariate analysis in section 3. Finally,
in section 4 we draw our conclusions.
\bigskip
\resection{Physical observables and the neural network}
At hadron colliders the dominant mechanism for Higgs production, in the
intermediate mass range we are interested in, is gluon-gluon
fusion. The best decay channel for identification is two real
$Z^0$ bosons for Higgs mass $M_H \ge 2 M_Z$
or, for $M_H \le 2 M_Z$, one real and one off-shell $Z^0$,
followed by their leptonic decays. LHC studies have shown that this
channel is the most efficient one for $130 \le M_H \le 800$ GeV; we shall
consider here the case $M_H\, = \, 150$ GeV where the presence
of one virtual
$Z^0$ renders the analysis more demanding and the case
$M_H\, = \, 200$ GeV just above the threshold for production of
two real $Z$'s. In this region
top production constitutes the most important background.
Usual ways to reduce the background are lepton isolation, lepton pair mass
constraints around $Z$ mass and
a lepton detection threshold  around
$P_T^{\ell}
\simeq 10$ GeV \cite{lhc}. In this paper we do not impose any cut on physical
variables; we simply choose a set of physical observables
whose values are the inputs of a neural classifier,
since we expect that, with an appropriate choice of such observables,
the discrimination will occur automatically as an outcome of the neural
classifier.

Before discussing the variables let us examine
the background in  more detail. Besides $t \bar t$ production
followed by top semileptonic decay to bottom and $b$ semileptonic decay, one
expects other sources of background, most notably
$Zb \bar b$; however, as shown by the LHC study groups, this
process is expected to contribute only about one third of the $t \bar t \to
4 \mu$ cross section; therefore we shall neglect it at this stage,
because in this letter we are interested in
a study of the relative performance of the NN and the multivariate analysis
rather than in a comprehensive study of the background. In any event we do not
expect
a significant change of the results, should these other minor
backgrounds be included.

On the other hand several other background processes
become important if one does not impose cuts on lepton transverse
momenta; to take them into account, together with the dominant
$t \bar t \to
4 \mu X$,
we shall include as background processes all the events where four
muons are in the final state and a $t \bar t$ pair has been produced,
without forcing their decay. We have checked, by simulating the events
with the Pythia Monte Carlo \cite{pyt}, that these background events produce
$\sigma \times BR \simeq 11$ pb, about a factor of $25$ larger than
$\sigma \times BR$ for $pp \to t \bar t X \to 4 \mu X$,
choosing a top mass of 130 GeV \cite{ffr}.

Let us now list the physical observables we have used for
discrimination between background and signal. We have considered
10 observables:
\begin{description}
\item[$X_1$) - $X_4$)] The transverse momenta of the four muons. The two
$\mu^+$ and the two $\mu^-$ can be ordered according to
their energies.
As expected, the distributions of these variables for background
events, as simulated by the Pythia Monte Carlo, show a maximum close
to zero, while the
signal distributions show a peak around 25 -- 50 GeV.
\par\noindent
\item[$X_5$) - $X_8$)] The invariant masses of the four
different $\mu^+ \mu^-$ pairs. Also these pairs can be ordered according
to the lepton energies.
For signal events these distributions show a peak around the $Z^0$ mass
which is absent for the background. The peaks arise from events where two
muons come
from the real $Z^0$; they are present in all 4 variables since the
ordering based on the
energy partly mixes the muons coming from the two $Z^0$'s.
\par\noindent
\item[$X_9$)] The four-muon invariant mass.
\item[$X_{10}$)] The hadron multiplicity.
\end{description}
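As an illustration, the ten observables can be assembled directly from the
muon four-momenta. The following Python sketch is ours, not part of the
analysis code: it assumes four-vectors in the $(E, p_x, p_y, p_z)$
convention, in GeV, with the energy ordering of the charge pairs described
above.

```python
import math

def pt(p):
    """Transverse momentum of a four-vector (E, px, py, pz)."""
    return math.hypot(p[1], p[2])

def inv_mass(*vectors):
    """Invariant mass of a sum of four-vectors (E, px, py, pz)."""
    e, px, py, pz = (sum(c) for c in zip(*vectors))
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

def observables(mu_plus, mu_minus, n_hadrons):
    """Build X_1..X_10 from the two mu+ and the two mu- four-vectors
    and the hadron multiplicity."""
    # order each charge pair by energy, as in the text
    mu_plus = sorted(mu_plus, key=lambda p: p[0], reverse=True)
    mu_minus = sorted(mu_minus, key=lambda p: p[0], reverse=True)
    muons = mu_plus + mu_minus
    x = [pt(p) for p in muons]                                # X_1 - X_4
    x += [inv_mass(p, q) for p in mu_plus for q in mu_minus]  # X_5 - X_8
    x.append(inv_mass(*muons))                                # X_9
    x.append(float(n_hadrons))                                # X_10
    return x
```

In the actual analysis these values are then rescaled to $[0,1]$ before
being fed to the network, as described below.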

This choice of variables is mainly based on kinematical considerations and
should be considered as {\it minimal}, since we expect that
other dynamical
variables, besides the hadron multiplicity, can improve the performance
of the network. We shall come back to this point later on.

The physical observables $X_j$ discussed above, once normalized to the interval
$[0,1]$, become the inputs $x_j \; (j=1,\dots ,n)$
of our neural network classifier. We
employ the most common architecture used for high energy applications (see,
e.g. \cite{nn}), i.e.
the {\it feed-forward} neural network; in our case it
comprises one input layer with
 $n=10$ neurons $x_j$, one layer with $2n+1$ hidden neurons $z_j$
and one output unit $y$. We employ the {\it backpropagation} algorithm
\cite{rum} to train the network. The events are divided into two sets,
the training and the testing set. Each event $p$ of the training set
consists of the array $x_j$ of the input variables and the value $y$ of the
output neuron ($y=0$ or $1$ for the event describing the signal, i.e.
Higgs production, or the background, respectively). At each time
step and for any pattern $p$ in the training set, the algorithm
modifies the synaptic
couplings giving the strength of the interaction between the hidden
layer and the output neuron:
\be
W_{i} \; \rightarrow \; W_{i} \; + \; \Delta W^{(p)}_{i}
\ee
\par\noindent
with
\be
\Delta W^{(p)}_{i}  \; =\; -\lambda {\partial E^{(p)} \over
\partial W_{i} } \; + \; \alpha \Delta W^{(old)}_{i} \; ,
\ee
\par\noindent
where
\be
E^{(p)}= {1 \over 2} (y^{(p)} -t^{(p)})^2 \; .
\ee

In the previous equation,
for any pattern $p$ in the training set, $t^{(p)}$ is
the expected {\it target} ($t^{(p)}= 0$ or $1$, for signal and
background event respectively) and $y$ is given by:
\be
y \; = \; g(u, \theta) \;
\ee
\par\noindent
where the transfer function $g(u,\theta)$ is as follows:
\be
g(u,\theta) \; = \; {1 \over 1 + \exp (- {u - \theta \over T })}
\ee
and $u$ is given in terms of the hidden variables by $u \; =
\sum_{l=1}^{2n+1} W_{l} \, z_l$.
Similar relations hold between the hidden variables $z_k$ and the input
neurons $x_j$, so that
the repeated application of Eqs. (2.1) and (2.2)
fixes the couplings $W_{kj}$ between the hidden neurons $z_k$ and the
input neurons $x_j$ as well. In our simulations we use
the values $\alpha =0.9$, $\lambda=1$ and $T=1$ for the network parameters.
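The paper gives no code; the training step of Eqs. (2.1)--(2.5) can be
sketched in Python as follows. The class layout and the uniform weight
initialization are our own choices; the architecture of the text corresponds
to $n=10$.

```python
import math
import random

def g(u, theta=0.0, T=1.0):
    """Logistic transfer function of Eq. (2.5)."""
    return 1.0 / (1.0 + math.exp(-(u - theta) / T))

class FeedForwardNN:
    """n inputs -> 2n+1 hidden sigmoid units -> 1 sigmoid output, trained
    by backpropagation with momentum as in Eqs. (2.1)-(2.2)."""

    def __init__(self, n, lam=1.0, alpha=0.9, seed=1):
        rng = random.Random(seed)
        self.n, self.nh = n, 2 * n + 1
        self.lam, self.alpha = lam, alpha
        # hidden couplings W_kj and output couplings W_k, plus momentum terms
        self.Wkj = [[rng.uniform(-0.5, 0.5) for _ in range(n)]
                    for _ in range(self.nh)]
        self.W = [rng.uniform(-0.5, 0.5) for _ in range(self.nh)]
        self.dWkj = [[0.0] * n for _ in range(self.nh)]
        self.dW = [0.0] * self.nh

    def forward(self, x):
        """z_k = g(sum_j W_kj x_j) and y = g(sum_k W_k z_k)."""
        self.z = [g(sum(w * xi for w, xi in zip(row, x))) for row in self.Wkj]
        self.y = g(sum(w * zi for w, zi in zip(self.W, self.z)))
        return self.y

    def train_pattern(self, x, t):
        """One update Delta W = -lam dE/dW + alpha Delta W_old for pattern
        (x, t), with E = (y - t)^2 / 2; returns E before the update."""
        y = self.forward(x)
        delta = (y - t) * y * (1.0 - y)  # dE/du at the output unit
        for k in range(self.nh):
            # hidden deltas are computed with the couplings before the update
            dh = delta * self.W[k] * self.z[k] * (1.0 - self.z[k])
            self.dW[k] = -self.lam * delta * self.z[k] + self.alpha * self.dW[k]
            for j in range(self.n):
                self.dWkj[k][j] = (-self.lam * dh * x[j]
                                   + self.alpha * self.dWkj[k][j])
        for k in range(self.nh):
            self.W[k] += self.dW[k]
            for j in range(self.n):
                self.Wkj[k][j] += self.dWkj[k][j]
        return 0.5 * (y - t) ** 2
```

With the parameters of the text ($\lambda=1$, $\alpha=0.9$, $T=1$) one such
update is performed for every pattern $p$ of the training set at each time
step.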

\resection{Results}
Our simulations have been performed with
the Pythia Monte Carlo code \cite{pyt}. We have considered two masses
of the Higgs particle: one below $2 M_Z$ i.e. $150$ GeV, and
one just above, i.e. $200$ GeV. The
top quark mass has been put equal to
130 GeV (increasing $m_t$ improves the
results since it reduces the background).
The simulated events have been divided into two sets,
the training set, used by the network to learn, and the testing set,
used to test the performance of the NN.
The training set consists of an equal number $N \,=\,  5,000 $
of background and signal events:
we have checked that the results are stable against changes of $N$.
On the other hand in the testing set the populations of the two samples,
background and signal,
have been taken different; as a matter of fact, we have considered
2,000 signal $pp \to H X \to 4 \mu X$
events, independently of the Higgs mass, while
one has to take $1.1 \times 10^7$ and $4.3 \times 10^6$ background events
for the two cases of $M_H \, = \, 150$ and $200$ GeV respectively.
The ratios between signal and background cross sections
that we use are computed in the Standard Model by the Pythia Monte Carlo.

If the sample is statistically significant one may consider a smaller set of
background events and rescale the final results (i.e. the number of
misinterpreted background events) according to the predictions of the
Standard Model. We have checked that this procedure works already with 50,000
background events.
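This rescaling is a one-line bookkeeping step (the function name is ours):

```python
def rescaled_background(n_misid, n_simulated, n_expected):
    """Rescale the misidentified-background count observed in a simulated
    subsample of n_simulated events to the full population n_expected
    predicted by the Standard Model."""
    return n_misid * n_expected / n_simulated
```

For $M_H = 150$ GeV, for instance, one rescales from the $50,000$
simulated background events to the $1.1 \times 10^7$ expected ones.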

We have considered four different cases in our simulations.
In the first case the 10 variables have been
used with no cuts; the number of simulated events quoted above refers
to this case. In the second case, in
order to increase the signal to background ratio, we have considered
only events with the $4 \, \mu$ invariant mass
in the range $M_H \pm 10$ GeV. In these simulations we have obtained
slightly
worse results, i.e. the discrimination seems more
difficult; moreover, in this case, the training phase
lasts longer.

In order to determine the most suitable variables for this kind of study,
we have also repeated the analysis by using only five variables. We have
considered two of these cases, one with $ x_1,\, x_2,\, x_3,\, x_4$ and
$x_9$ as input variables and the other with input variables
$x_5,\, x_6, \, x_7,\, x_8,\, $ and $x_9$.
In both these cases the results are slightly worse as compared
to the choice of 10 variables. Therefore, in the evaluation of
the performance of the NN we shall refer
to the first of the four cases we have discussed, i.e. 10 variables
with no cut on $M_{\mu \mu \mu \mu}$.

The performance of the NN classifier can
be assessed by introducing two
variables: the purity ($P$) and the efficiency ($\eta$) defined as follows:
\be
P \, =\, \frac{N^a_H}{N^a_H + N^a_B}
\ee
and
\be
\eta \, = \, \frac{N^a_H}{N_H}
\ee
where $N_H$ is the total number of Higgs events in the testing sample, $N^a_H$
is the total number of the accepted
(i.e.
correctly identified)
Higgs events and
$N^a_B$ is the total number of the
accepted background events,
i.e. events that are incorrectly identified as
Higgs events.

One can increase the purity at the expense of the efficiency
by introducing a
threshold parameter $l \in [0,1]$ as follows.
The range of values of the output
neuron $y^{(p)}$ in the testing phase is divided into
the subintervals $I_1 =
[0,l]$ and $I_2 = (l,1]$, so that if $y^{(p)} \in
I_1$ (respectively $y^{(p)} \in I_2$) the event is
classified as signal (respectively background).
Clearly, by taking $l$ sufficiently small, one increases the purity. For example
at $M_H \, = \, 150$ GeV one obtains $P=0.1$ for $l \approx 10^{-3}$ and
$P=0.25$ for $l \approx 0.5 \times 10^{-4}$.
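How $P$, $\eta$ and the threshold $l$ interact can be sketched in a few
lines of Python (the function is ours; `weight` stands for the Standard
Model rescaling of the background sample discussed above):

```python
def purity_efficiency(y_signal, y_background, l, weight=1.0):
    """Purity P = N_H^a / (N_H^a + N_B^a) and efficiency eta = N_H^a / N_H
    for threshold l: an event is accepted as a Higgs candidate when its
    network output falls in I_1 = [0, l] (the target is t = 0 for signal).
    `weight` rescales the simulated background sample to the population
    expected from the Standard Model cross sections."""
    n_h_acc = sum(1 for y in y_signal if y <= l)       # accepted signal
    n_b_acc = weight * sum(1 for y in y_background if y <= l)  # accepted bkg
    eta = n_h_acc / len(y_signal)
    purity = n_h_acc / (n_h_acc + n_b_acc) if n_h_acc + n_b_acc > 0 else 0.0
    return purity, eta
```

Scanning $l$ from $1$ down to $0$ traces out a purity versus efficiency
curve of the kind shown in Fig. 1.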

Our results are reported in Fig. 1, which shows that,
as expected, the case with $M_H \, = \, 200$ GeV is certainly
more favourable than the case with $M_H \, = \, 150$ GeV.  Fig. 1 also shows
that in both cases one can reach appreciable values of purity,
even though, especially for low Higgs mass, the reduction of efficiency
is relevant.


Let us now compare these results with the maximum
likelihood method.
First of all, as a general comment, we observe that the neural network
is more flexible than the multivariate analysis, since in the former case
by an appropriate choice of the parameter $l$ one can increase the purity
without limitations, at least in principle. This flexibility is absent
in the multivariate analysis because this method only uses averages
and not a uniform fit to the data as the NN does.
When the comparison is possible, i.e. for values of
the efficiency larger than $30\%$, the maximum likelihood method
gives results that are significantly worse. By way of example, at
$M_H \, = \, 150$ GeV with an efficiency of $85\%$
the traditional method gives a purity of 0.02 (a factor of 3 worse
than the NN result of Fig. 1);
with an efficiency of $99\%$
the traditional method gives a purity of 0.01, again worse than NN.
Similar results are obtained with $M_H \, = 200$ GeV: for example,
an
efficiency of $34\%$ corresponds, with
the traditional method, to a purity of 0.07, while the NN
gives a purity of roughly 0.35 at the same value of efficiency.

\resection{Conclusions}

Our results show that NN can be of some help in the difficult task of
discriminating background events from the signal in the Higgs search
at the future Large Hadron Collider to be built at CERN. We have
proved this by considering one particular Higgs decay channel ($H \to 4 \mu$)
in the Higgs mass range (150-200) GeV and including the
most relevant background. We are conscious of the limits of the present
analysis: for example other sources of background should be included and
different Monte Carlo codes might be employed to test the independence
of the results from the theoretical inputs. Moreover
other global variables, similar to
the hadron multiplicity and sensitive to
the infrared structure of the QCD radiation, could be introduced,
even though the use of preprocessed
observables might limit the whole range
of possibilities of the neural trigger. We plan to perform these
analyses in a subsequent paper. On the other hand, from our experience on
a similar subject \cite{mar} we do not expect a dependence of the results
on the architecture of the neural network. We feel,
therefore, that we have correctly addressed the main point
i.e. the comparison between the neural network
and the usual multivariate analysis
based on the maximum likelihood method.
Our results show that NN
compare favourably with the traditional
statistical analysis.
Needless to say, NNs have another
clear advantage over traditional statistical methods: they
can support a high degree of parallelism and could be used for
on-line analysis of the experimental data. Therefore their
use in the future LHC experiments should be seriously considered
and thoroughly investigated.
\vskip 1cm

{\bf Acknowledgements.} We wish to thank G. Marchesini for several
helpful comments on the subject of this work and M.C. Cousinou, S. Basa and
C. Marangi for useful discussions.


\newpage
\begin{thebibliography}{99}

\bibitem{lus} M. L\"uscher and P. Weisz, Nucl. Phys. {\bf B290}
(1987) 5; {\bf B295} (1988) 65 and {\bf B318} (1989) 705.

\bibitem{tec} E. Farhi and L. Susskind, Phys. Rep. {\bf 74} (1981) 77.


\bibitem{alt} G. Altarelli, to appear in the Proceedings of EPS High Energy
Physics Conference, Marseille July 1993.

\bibitem{cas} R. Casalbuoni, S. De Curtis, D. Dominici and R. Gatto,
Nucl. Phys. {\bf B 282} (1987) 235;
R. Casalbuoni, P. Chiappetta, D. Dominici, F. Feruglio and R. Gatto,
Phys. Lett. {\bf B 269} (1991) 361.

\bibitem{sus} J. Ellis, in {\it Ten Years of SUSY Confronting Experiment},
CERN-TH.6707/92.

\bibitem{lep} G. Giacomelli and P. Giacomelli, CERN-PPE/93-107,
to be published in Riv. Nuovo Cimento.

\bibitem{lhc} D. Froidevaux, in Proc. of {\it Large Hadron Collider Workshop},
Eds. G. Jarlskog and D. Rein, CERN 90-10 and ECFA 90-133, Vol. II, p. 444;
A. Nisati, ibid. p. 492; M. Della Negra et al., ibid. p. 509.


\bibitem{pyt} H. U. Bengtsson and T. Sj\"ostrand, Computer
Physics Commun. {\bf 46} (1987) 43; T. Sj\"ostrand, CERN-TH.6488/92.

\bibitem{ffr} D. Froidevaux, R. Hawkings, L. Poggioli, L. Serin, R.St.Denis,
Update of Results on Intermediate Mass Higgs, report (18 Nov. 1992).

\bibitem{nn} L. L\"onnblad, C. Petersen and T.
R\"ognvaldsson, Nucl. Phys. {\bf B349} (1991) 675;
C. Bortolotto, A. De Angelis and L. Lanceri, Nucl.
Inst. and Methods {\bf A306} (1991) 457;
L. Bellantoni et al., Nucl. Inst. and Methods {\bf A310} (1991) 618.

\bibitem{rum}
D. E. Rumelhart, G. E. Hinton and R. J. Williams, in {\it Parallel
Distributed Processing: Explorations in the Microstructure of
Cognition}, MIT Press, Cambridge MA (1986).

\bibitem{mar} G. Marchesini, G. Nardulli and G. Pasquariello, Nucl. Phys.
{\bf B 394} (1993) 541.

\end{thebibliography}
\newpage

\begin{center}
  \begin{Large}
  \begin{bf}
  Figure Caption
  \end{bf}
  \end{Large}
\end{center}
  \vspace{5mm}
\begin{description}
\item [Fig. 1] The purity $P$ versus the Higgs efficiency
$\eta$ for two different sets of data: $M_H \, = \, 150$ and
200 GeV (lower and upper line respectively).
\end{description}
\end{document}
400
 l cl s 1798 356 m 1786 337 l 1809 337 l 1798 356 l cl s 1824 350 m 1813 330 l
 1836 330 l 1824 350 l cl s 1847 347 m 1836 327 l 1859 327 l 1847 347 l cl s
 1860 331 m 1848 312 l 1871 312 l 1860 331 l cl s 1874 331 m 1862 311 l 1885
311
 l 1874 331 l cl s 1883 325 m 1871 305 l 1894 305 l 1883 325 l cl s 1888 320 m
 1876 300 l 1899 300 l 1888 320 l cl s 1893 309 m 1882 289 l 1905 289 l 1893
309
 l cl s 375 1893 m 375 1893 l 444 1380 l 474 1154 l 564 983 l 605 933 l 753 778
 l 1096 740 l 1373 719 l 1501 657 l 1549 649 l 1676 602 l 1708 557 l 1761 540 l
 1777 533 l 1794 521 l 1800 514 l 1808 502 l 1814 498 l 1817 492 l 1824 486 l
 1849 460 l 1861 450 l 1868 438 l 1886 428 l s 375 1903 m 369 1893 l 375 1883 l
 381 1893 l 375 1903 l cl s 444 1390 m 439 1380 l 444 1371 l 450 1380 l 444
1390
 l cl s 474 1164 m 469 1154 l 474 1144 l 480 1154 l 474 1164 l cl s 564 993 m
 559 983 l 564 974 l 570 983 l 564 993 l cl s 605 943 m 599 933 l 605 924 l 611
 933 l 605 943 l cl s 753 788 m 747 778 l 753 768 l 759 778 l 753 788 l cl s
 1096 750 m 1090 740 l 1096 730 l 1101 740 l 1096 750 l cl s 1373 729 m 1367
719
 l 1373 710 l 1378 719 l 1373 729 l cl s 1501 667 m 1495 657 l 1501 647 l 1507
 657 l 1501 667 l cl s 1549 658 m 1543 649 l 1549 639 l 1555 649 l 1549 658 l
cl
 s 1676 612 m 1670 602 l 1676 592 l 1682 602 l 1676 612 l cl s 1708 567 m 1702
 557 l 1708 547 l 1714 557 l 1708 567 l cl s 1761 550 m 1755 540 l 1761 530 l
 1767 540 l 1761 550 l cl s 1777 543 m 1771 533 l 1777 523 l 1783 533 l 1777
543
 l cl s 1794 531 m 1788 521 l 1794 511 l 1800 521 l 1794 531 l cl s 1800 524 m
 1794 514 l 1800 504 l 1805 514 l 1800 524 l cl s 1808 512 m 1802 502 l 1808
492
 l 1814 502 l 1808 512 l cl s 1814 508 m 1808 498 l 1814 489 l 1820 498 l 1814
 508 l cl s 1817 501 m 1811 492 l 1817 482 l 1823 492 l 1817 501 l cl s 1824
496
 m 1818 486 l 1824 476 l 1830 486 l 1824 496 l cl s 1849 470 m 1843 460 l 1849
 451 l 1855 460 l 1849 470 l cl s 1861 460 m 1855 450 l 1861 440 l 1867 450 l
 1861 460 l cl s 1868 448 m 1862 438 l 1868 428 l 1874 438 l 1868 448 l cl s
 1886 438 m 1880 428 l 1886 418 l 1892 428 l 1886 438 l cl s 375 1903 m 369
1893
 l 375 1883 l 381 1893 l 375 1903 l cl s 444 1390 m 439 1380 l 444 1371 l 450
 1380 l 444 1390 l cl s 474 1164 m 469 1154 l 474 1144 l 480 1154 l 474 1164 l
 cl s 564 993 m 559 983 l 564 974 l 570 983 l 564 993 l cl s 605 943 m 599 933
l
 605 924 l 611 933 l 605 943 l cl s 753 788 m 747 778 l 753 768 l 759 778 l 753
 788 l cl s 1096 750 m 1090 740 l 1096 730 l 1101 740 l 1096 750 l cl s 1373
729
 m 1367 719 l 1373 710 l 1378 719 l 1373 729 l cl s 1501 667 m 1495 657 l 1501
 647 l 1507 657 l 1501 667 l cl s 1549 658 m 1543 649 l 1549 639 l 1555 649 l
 1549 658 l cl s 1676 612 m 1670 602 l 1676 592 l 1682 602 l 1676 612 l cl s
 1708 567 m 1702 557 l 1708 547 l 1714 557 l 1708 567 l cl s 1761 550 m 1755
540
 l 1761 530 l 1767 540 l 1761 550 l cl s 1777 543 m 1771 533 l 1777 523 l 1783
 533 l 1777 543 l cl s 1794 531 m 1788 521 l 1794 511 l 1800 521 l 1794 531 l
cl
 s 1800 524 m 1794 514 l 1800 504 l 1805 514 l 1800 524 l cl s 1808 512 m 1802
 502 l 1808 492 l 1814 502 l 1808 512 l cl s 1814 508 m 1808 498 l 1814 489 l
 1820 498 l 1814 508 l cl s 1817 501 m 1811 492 l 1817 482 l 1823 492 l 1817
501
 l cl s 1824 496 m 1818 486 l 1824 476 l 1830 486 l 1824 496 l cl s 1849 470 m
 1843 460 l 1849 451 l 1855 460 l 1849 470 l cl s 1861 460 m 1855 450 l 1861
440
 l 1867 450 l 1861 460 l cl s 1868 448 m 1862 438 l 1868 428 l 1874 438 l 1868
 448 l cl s 1886 438 m 1880 428 l 1886 418 l 1892 428 l 1886 438 l cl s 1098
125
 m 1099 128 l 1102 132 l 1105 132 l 1107 130 l 1107 127 l 1105 121 l 1102 110 l
 s 1105 121 m 1109 127 l 1112 130 l 1115 132 l 1118 132 l 1121 128 l 1121 124 l
 1119 116 l 1115 99 l s 22 1195 m 55 1195 l s 22 1195 m 22 1210 l 23 1214 l 25
 1216 l 28 1217 l 33 1217 l 36 1216 l 38 1214 l 39 1210 l 39 1195 l s
showpage grestore


