Onwukwe Ijioma
International Journal of Scientific and Statistical Computing (IJSSC), Volume (8) : Issue (1) : 2020 1
A Stochastic Iteration Method for A Class of Monotone
Variational Inequalities In Hilbert Space
Onwukwe Ijioma onwukweijioma@abiastateuniversity.edu.ng
Department of Mathematics
Abia state University, Uturu,
Abia State, Nigeria.
Abstract
We examine a general method for obtaining a solution to a class of monotone variational
inequalities in Hilbert space, for a continuous linear monotone operator given on a nonempty
closed convex subset of the space. Starting from an arbitrary initial point, we propose an
iterative method that converges in norm to a solution of this class of monotone variational
inequalities, together with a stochastic scheme that is a strong stochastic
approximation of it.
Key words: Linear Monotone Operator, Hilbert Space, Stochastic Approximation.
1. INTRODUCTION
Many researchers have studied monotone variational inequalities over the years because of
their relevance to many disciplines, such as partial differential equations, optimization, optimal
control and finance.
We will assume a real Hilbert space equipped with an inner product and the associated
induced norm.
Let the feasible set be a nonempty, closed and convex subset of the space; strong and weak
convergence of sequences are denoted in the usual way.
A map
(1)
(2)
If there exists an increasing function satisfying
(3)
then the space is uniformly convex.
Consider the continuous linear monotone operator
(4)
given on a closed convex subset of the space.
The importance of the solution of (4) lies in its application in the many areas where problems
reduce to determining solutions of equilibrium problems.
In a special case, (4) reduces to a fixed point problem for the nearest point projection map
onto the feasible set, for some arbitrary fixed positive constant.
2. PRELIMINARIES
In the work of Zhou, Zhou and Feng (see [11]), the authors examined a general
regularization method for seeking a solution to a class of monotone variational
inequalities in Hilbert space, where the regularizer is a hemicontinuous and strongly monotone
operator.
Again, in the work of Lud and Yang (see [12]), the authors considered monotone inverse
variational inequalities on a nonempty closed convex subset of a real Hilbert space. They
presented a general regularization method for monotone inverse variational inequalities, in which
the regularizer is a Lipschitz continuous and strongly monotone mapping, and showed that the
method converges strongly to a solution of the inverse variational inequality.
Our interest in this paper is to examine and fully present a stochastic iteration process which
converges strongly to the solution of (4).
Recall that equilibrium problems can be formulated as a co-variational inequality problem
involving a continuous linear monotone map:
Find a point satisfying (5)
The basic relationship between (4) and (5) is given by the following.
Theorem 1: For a linear continuous monotone operator, the following statements are equivalent:
(i) the point satisfies the co-variational inequality (5);
(ii)
It is known that if the operator is continuous and strongly monotone, then the problem
has a unique solution (see [1, 2, 3]).
The Mann, Ishikawa and other iterative processes have traditionally been devoted to the case
where the operator possesses strong monotonicity properties. When the operator is not
necessarily uniformly or strongly monotone, some of the commonly applied iterative schemes,
such as the Mann and Ishikawa processes, do not generate a unique solution.
In many practical settings, we are interested in obtaining an approximation of the solution from
experimental observations.
The presence of uncertainty in such observations suggests the stochastic approximation
method as the appropriate tool of investigation.
This work extends to infinite-dimensional Hilbert space the Euclidean-space results of [4].
The operator equation is reformulated as a uniquely solvable gradient operator
equation, which is approximated using a least squares approximation method.
Let a real separable Hilbert space with an inner product be given. Given a probability
space, a random vector is a measurable mapping defined on the probability
space with values in the Hilbert space.
It is known that the following equalities hold:
(6)
Recall also that the metric projection from a real Hilbert space onto a nonempty, closed and
convex subset is defined as follows:
For each point of the space there exists a unique element of the subset with the following properties:
(7)
(8)
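The nearest-point property above has a simple closed form when the convex set is a ball. The following sketch is a finite-dimensional analogue (R^2 standing in for the Hilbert space, the closed unit ball for the convex subset; the function name is illustrative, not from the paper):

```python
import math

def project_onto_ball(x, radius=1.0):
    """Metric projection onto the closed ball {y : ||y|| <= radius}.

    For this particular convex set the projection has a closed form:
    points inside the ball are fixed, and points outside are scaled
    back radially to the boundary, the unique nearest point.
    """
    norm = math.sqrt(sum(c * c for c in x))
    if norm <= radius:
        return list(x)
    return [radius * c / norm for c in x]

# A point outside the unit ball is mapped to the nearest boundary
# point; a point inside the ball is left unchanged.
print(project_onto_ball([3.0, 4.0]))   # lands on the boundary
print(project_onto_ball([0.3, 0.1]))   # unchanged
```

For a general closed convex set the projection rarely has such a closed form, which is why (7)-(8) are stated as a characterization rather than a formula.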
LEMMA 1: Let the map under consideration be the metric projection from a real Hilbert space
onto a nonempty, bounded, closed and convex subset. Then:
and (9)
(1) (10)
An operator is said to be
(a) Monotone if (11)
(b)
(12)
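Monotonicity in the sense of (11) — the inner product of the operator difference with the point difference is nonnegative — can be checked numerically. A minimal finite-dimensional sketch, using a symmetric positive definite matrix as the linear monotone operator (all names and data are illustrative):

```python
def inner(u, v):
    """Inner product of two vectors, standing in for the Hilbert space pairing."""
    return sum(a * b for a, b in zip(u, v))

def monotone_gap(A, x, y):
    """Return <Ax - Ay, x - y>; this is nonnegative when A is monotone."""
    d = [a - b for a, b in zip(x, y)]
    Ad = [inner(row, d) for row in A]   # A is linear, so Ax - Ay = A(x - y)
    return inner(Ad, d)

# A symmetric positive definite matrix (eigenvalues 1 and 3) induces
# a continuous linear monotone operator on R^2.
A = [[2.0, 1.0],
     [1.0, 2.0]]
print(monotone_gap(A, [1.0, -2.0], [0.5, 3.0]) >= 0.0)   # True for any pair
```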
3. FORMULATION OF THE STOCHASTIC APPROXIMATION ALGORITHM
Let the expectation operator be given, with its usual properties.
Given any linear continuous monotone map, there exists a continuous, convex and
everywhere Fréchet differentiable scalar function such that
(13)
holds. The gradient mapping, uniquely determined for a given such function, coincides with
the operator. By [7], solving the variational inequality
is equivalent to finding the unique zero of the gradient mapping, as in (14)
Given a monotone operator defined from the Hilbert space into itself, the
iterative scheme
(15)
converges strongly to a zero of the operator for a suitable choice of step sizes.
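A finite-dimensional analogue of scheme (15) with a fixed step illustrates this strong convergence to the zero of the operator. This is only a sketch under assumed data (a strongly monotone 2x2 matrix whose unique zero is the origin; the names and step value are illustrative):

```python
def iterate_to_zero(A, x0, t=0.1, steps=500):
    """Run x_{n+1} = x_n - t * A(x_n), an analogue of scheme (15).

    For a strongly monotone linear operator A and a small enough fixed
    step t, the iterates converge in norm to the unique zero of A.
    """
    x = list(x0)
    for _ in range(steps):
        Ax = [sum(r * c for r, c in zip(row, x)) for row in A]
        x = [xi - t * axi for xi, axi in zip(x, Ax)]
    return x

A = [[2.0, 1.0], [1.0, 2.0]]           # positive definite; zero of A is the origin
x = iterate_to_zero(A, [5.0, -3.0])
print(max(abs(c) for c in x))          # essentially zero after 500 steps
```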
We now consider a stochastic approximation algorithm. The usual form of a stochastic
approximation algorithm for locating a minimum point of a function is
(16)
where the update is taken under the underlying probability law, uses an
approximation of the gradient, and employs a sequence of positive scalars that decreases
to zero (see [5], [6] and [8]).
The important part of (16) is the gradient approximation. We apply objective function
measurements, and the method requires the minimization of a continuous and convex function.
Consider a sequence of random vectors computed from different data points, chosen so
that it approximates the gradient strongly and in mean square, in the following sense.
Definition 1: The sequence of random vectors strongly approximates the gradient if
(17)
holds, and is consistent with it in mean square if
(18)
holds, where the expectation is taken over the data points.
4. CONVERGENCE OF THE STOCHASTIC APPROXIMATION ALGORITHM
Theorem 2: Let the sequence of positive numbers satisfy the following conditions:
(a)
(b)
(c)
Given a sequence of random vectors satisfying (15) and (16), the stochastic
sequence defined iteratively from an arbitrary initial point converges strongly
to a unique solution almost surely.
Proof: Define the differences as indicated; they form a sequence of
independent random variables. By (15), the corresponding
sequence of partial sums is a martingale.
The convergence of the series in Theorem 2(c) implies that the sum of second moments is
finite. Thus, by the martingale convergence theorem (see [10]), the partial sums converge
almost surely, and the stated limits follow.
Under suitable hypotheses, the stochastic analogue of (15) behaves asymptotically
as (15) itself, so the convergence properties of (15) are preserved when the exact operator
values are replaced by their stochastic estimates. It follows that the stochastic iteration
scheme converges strongly to the unique solution.
By (18), the estimate strongly approximates the gradient. Furthermore, the corresponding
bound can be established for all least-squares approximations. The stochastic
approximation algorithm generated by the stochastic sequence is then given as
follows:
Let an initial estimate of the solution be given.
(a) Compute
(b) Compute
(c) Compute
(d) Check for convergence:
Is the stopping criterion satisfied?
Yes: stop.
No: Set the next iterate and return to (a).
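The loop above can be sketched in a finite-dimensional analogue: exact evaluations of the operator are replaced by noisy measurements, and the step sizes t_n = c/(n+1) satisfy conditions (a)-(c) of Theorem 2. Everything here (matrix, noise level, constants) is illustrative data, not the paper's:

```python
import random

def stochastic_scheme(A, x0, c=1.0, steps=20000, noise=0.5, seed=0):
    """Stochastic analogue of the iteration: x_{n+1} = x_n - t_n * g_n.

    g_n is a noisy measurement of A(x_n), and t_n = c/(n+1) satisfies
    t_n -> 0, sum t_n = infinity, sum t_n**2 < infinity, so the noise
    averages out and the iterates approach the zero of A.
    """
    rng = random.Random(seed)
    x = list(x0)
    for n in range(steps):
        t = c / (n + 1)
        # compute a noisy estimate of A at the current iterate
        g = [sum(r * v for r, v in zip(row, x)) + rng.gauss(0.0, noise)
             for row in A]
        # update; a practical run would also apply the stopping test (d)
        x = [xi - t * gi for xi, gi in zip(x, g)]
    return x

A = [[2.0, 1.0], [1.0, 2.0]]           # strongly monotone; zero at the origin
x = stochastic_scheme(A, [5.0, -3.0])  # ends close to the zero despite the noise
```

With a fixed step the noise would keep the iterates bouncing at a fixed amplitude; the decreasing steps are what make the almost-sure convergence of Theorem 2 possible.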
5. CONCLUSION
We have examined a stochastic iteration method that converges strongly to the solution of
variational inequality problems in Hilbert space. The approach is similar to that in the finite-
dimensional case, but is developed for problems in the Hilbert space setting.
The inequality problem is reformulated as a root-finding problem of determining the zero of a
monotone operator in the Hilbert space.
The classical deterministic methods for approximating the zeros of monotone operators
are effective for a large class of problems; the stochastic method, however, can handle many
problems for which deterministic methods are inappropriate.
A further research direction and possible extension concerns the regularity of the solution.
Solution regularity plays a crucial part in the derivation of error estimates for
finite-element approximations of variational inequalities. The regularity problem for
general and quasi-variational inequalities remains an area for future work.
6. REFERENCES
[1] C.E. Chidume. "Steepest Descent Approximation for Accretive Operator Equations."
Nonlinear Anal. Theory Methods Appl., 26, pp. 299-311, 1996.
[2] H.K. Xu and T.H. Kim. "Convergence of Hybrid Steepest Descent Methods for Variational
Inequalities." Journal of Optimization Theory and Applications, vol. 119, no. 1, pp. 185-201,
2003.
[3] Z. Xu and G.Y. Roach. "A Necessary and Sufficient Condition for Convergence of Steepest
Descent Approximation to Accretive Operator Equations." J. Math. Anal. Appl., 167, pp. 340-354,
1992.
[4] A.C. Okoroafor. "A Strong Convergence Stochastic Algorithm for the Global Solution of
Autonomous Differential Equations in R^n." J. Applied Science, 6(11), pp. 2506-2509, 2006.
[5] O. Ijioma and A.C. Okoroafor. "A Stochastic Approximation Method Applied to a
Walrasian Equilibrium Model." IJAM, vol. 5, no. 1, pp. 35-44, 2012.
[6] M.A. Kouritzin. "On the Convergence of Linear Stochastic Approximation Procedures." IEEE
Trans. Inform. Theory, vol. 42, pp. 1305, 1996.
[7] R.T. Rockafellar. "On the Maximal Monotonicity of Subdifferential Mappings." Pacific J.
Math., 33, pp. 209-216, 1970.
[8] H. Walk and L. Zsidó. "Convergence of the Robbins-Monro Method for Linear Problems in a
Banach Space." J. Math. Analysis and Appl., vol. 135, pp. 152-177, 1989.
[9] L. Ljung, G. Pflug and H. Walk. "Stochastic Approximation and Optimization of Random
Systems." Boston, MA: Birkhäuser, 1992.
[10] P. Whittle. "Probability." John Wiley and Sons, London, 1976.
[11] H. Zhou, Y. Zhou and G. Feng. "Iterative Methods for Solving a Class of Monotone
Variational Inequality Problems with Applications." Journal of Inequalities and Applications,
vol. 2015, article 68, 2015. (https://doi.org/10.1186/s13660-015-0590-y)
[12] X. Lud and J. Yang. "Regularization and Iterative Methods for Monotone Inverse Variational
Inequalities." Optim. Lett., 8, pp. 1261-1272, 2014. (https://doi.org/10.1007/s11590-013-0653-2)
