Blue Brain technology enables humans to give new dimensions to science and technology and to make substantial progress toward understanding the brain; further details can be found in the accompanying PowerPoint presentation.
This paper presentation on the Blue Brain concept covers the definition of Blue Brain, how it is possible, uploading the human brain, what a virtual brain is, the functions of the brain, and the advantages and disadvantages of Blue Brain technology. With the advancement of technology, the human being, the ultimate source of information and discovery, should also be preserved. In other words, a human does not live for thousands of years, but the information in his mind could be saved and used for several thousand years. Thus, even after a person's death, we would not lose the knowledge, intelligence, personality, feelings, and memories of that person, which could be used for the development of human society.
The Blue Brain, a Swiss national brain initiative, aims to create a digital reconstruction of the brain by reverse-engineering mammalian brain circuitry. The mission of the project, founded in May 2005 by the Brain and Mind Institute of the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, is to use biologically-detailed digital reconstructions and simulations of the mammalian brain (brain simulation) to identify the fundamental principles of brain structure and function in health and disease.
Some futurists have claimed that within 30 years we will be able to scan ourselves into computers.
This presentation lists some brain-computer interface technologies that exist today and others that could become attainable in the future. At the end, philosophical comments about this kind of technology and transhumanism are proposed, in order to highlight the key differences between a human brain and artificial intelligence.
With the introduction of Blue Brain technology, which is based on reverse engineering, researchers hope to better understand and treat brain disorders and diseases. Blue Brain is the name of the world's first virtual brain, which allows a machine to function like a human brain. Even after a person's death, the complete functional attributes of their brain could be stored and used for further development.
EEG-Based BCI Applications with Deep Learning (Riddhi Jain)
A summary of a survey paper describing EEG-based BCI applications and sensing technologies and their computational-intelligence approaches, published on Jan 28, 2020.
Artificial Brain (AI)
Artificial brain (or artificial mind) is a term commonly used in the media[1] to describe research that aims to develop software and hardware with cognitive abilities similar to those of the animal or human brain. Research investigating "artificial brains" and brain emulation plays three important roles in science:
An ongoing attempt by neuroscientists to understand how the human brain works, known as cognitive neuroscience.
A thought experiment in the philosophy of artificial intelligence, demonstrating that it is possible, at least in theory, to create a machine that has all the capabilities of a human being.
A long-term project to create machines exhibiting behavior comparable to that of animals with complex central nervous systems, such as mammals and most particularly humans. The ultimate goal of creating a machine exhibiting human-like behavior or intelligence is sometimes called strong AI.
An example of the first objective is the project reported by Aston University in Birmingham, England[2] where researchers are using biological cells to create "neurospheres" (small clusters of neurons) in order to develop new treatments for diseases including Alzheimer's, motor neurone and Parkinson's disease.
The second objective is a reply to arguments such as John Searle's Chinese room argument, Hubert Dreyfus' critique of AI or Roger Penrose's argument in The Emperor's New Mind. These critics argued that there are aspects of human consciousness or expertise that can not be simulated by machines. One reply to their arguments is that the biological processes inside the brain can be simulated to any degree of accuracy. This reply was made as early as 1950, by Alan Turing in his classic paper "Computing Machinery and Intelligence".[3]
The third objective is generally called artificial general intelligence by researchers.[4] However, Ray Kurzweil prefers the term "strong AI". In his book The Singularity is Near, he focuses on whole brain emulation using conventional computing machines as an approach to implementing artificial brains, and claims (on the grounds of computing power continuing its exponential growth trend) that this could be done by 2025. Henry Markram, director of the Blue Brain project (which is attempting brain emulation), made a similar claim at the Oxford TED conference in 2009, predicting that it could be achieved by around 2020.
These slides use concepts from my (Jeff Funk) course, Analyzing Hi-Tech Opportunities, to analyze how neurosynaptic chips are becoming economically feasible for supercomputing applications. Neurosynaptic chips use a different architecture, one that mimics the brain with neurons and synapses, with those neurons and synapses built from conventional circuit elements. This presentation describes the advantages and disadvantages of synaptic chips compared with conventional chips, and how rapid rates of progress in speed, density, and power efficiency are making synaptic chips economically feasible for supercomputing applications. The biggest disadvantage of synaptic chips is in software: a new operating system and new application software are needed.
In Multi-Hop Routing Identifying Trusted Paths through TARF in Wireless Sens... (IJMER)
Multi-hop routing in wireless sensor networks (WSNs) is highly vulnerable to identity deception through the replaying of routing information. An attacker can exploit this weakness to launch various harmful, or even devastating, attacks against the routing protocols, such as sinkhole attacks, wormhole attacks, and Sybil attacks. The situation is further aggravated by mobile and harsh network conditions. Traditional cryptographic techniques and earlier efforts at developing trust-aware routing protocols do not effectively address this severe problem. To secure WSNs against attackers misdirecting multi-hop routing, we have designed and implemented TARF, a robust trust-aware routing framework for dynamic WSNs. Without tight time synchronization or known geographic information, TARF provides trustworthy and energy-efficient routes. Most importantly, TARF proves effective against those dangerous attacks built on identity deception; the resilience of TARF is verified through extensive evaluation with both simulation and empirical experiments on large-scale WSNs under various scenarios, including mobile and RF-shielding network conditions. Further, we have implemented a low-overhead TARF module in TinyOS; as demonstrated, this implementation can be incorporated into existing routing protocols with little effort. Based on TARF, we also demonstrated a proof-of-concept mobile target detection application that functions well against an anti-detection mechanism.
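The trust-aware route selection described above can be sketched in a few lines. This is a minimal illustration, not TARF's exact formulation: the neighbor-record fields (`trust`, `energy`) and the cost metric are assumptions for the sketch.

```python
def select_next_hop(neighbors):
    """Pick a next-hop neighbor by trust-weighted energy cost (sketch).

    Each neighbor record carries an assumed trust level `trust` in (0, 1]
    (belief that packets forwarded via it reach the base station) and an
    `energy` cost (expected transmissions per delivered packet). Minimizing
    energy/trust favors routes that are both cheap and trustworthy, so a
    node that advertises an attractive route but drops packets sees its
    trust, and hence its appeal, collapse over time.
    """
    viable = [n for n in neighbors if n["trust"] > 0.0]
    if not viable:
        return None  # no trusted neighbor: hold the packet or rediscover routes
    return min(viable, key=lambda n: n["energy"] / n["trust"])
```

In a full framework the trust values would be updated from observed delivery feedback; here they are simply given.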
User-Interactive Color Transformation between Images (IJMER)
Abstract: In this paper we present a process called color transfer, which can borrow one image's color characteristics from another. Most current colorization algorithms either require significant user effort or have large computational cost. The focus here is on an orthogonal color space, the lαβ color space, which has no correlation between its axes. We have implemented two global color transfer algorithms in lαβ space using simple color statistics such as the mean, standard deviation, and covariance of the image pixels; our approach is an extension of Reinhard's. Our local color transfer algorithm uses simple color statistical analysis to recolor the target image according to a color range selected in the source image. A color influence mask is prepared for the target image: a mask specifying which parts of the target image will be affected by the selected color range. The target image is then recolored in lαβ space according to the prepared color influence map. In the lαβ color space, luminance and chrominance information is separate, which makes recoloring of the luminance optional. The basic color transformation uses stored color statistics of the source and target images. All the algorithms are implemented in the Java object-oriented language. The main advantage of the proposed method over existing ones is that it allows the user to recolor part of an image in a simple and intuitive way, preserving the other colors intact and achieving a natural look.
Index Terms: color transfer, local color statistics, color characteristics, orthogonal color space, color influence map.
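The core of the Reinhard-style global transfer the abstract describes is per-channel statistics matching. The sketch below assumes both images have already been converted to a decorrelated space such as lαβ (the RGB↔lαβ conversion is omitted); it imposes the source image's per-channel mean and standard deviation on the target.

```python
import numpy as np

def transfer_color_stats(source, target):
    """Reinhard-style global color transfer (sketch).

    `source` and `target` are float arrays of shape (H, W, 3), assumed to be
    already in a decorrelated color space such as lαβ. The target's pixels
    are shifted and scaled, per channel, so their mean and standard
    deviation match the source's.
    """
    src = source.reshape(-1, 3)
    tgt = target.reshape(-1, 3)
    src_mean, src_std = src.mean(axis=0), src.std(axis=0)
    tgt_mean, tgt_std = tgt.mean(axis=0), tgt.std(axis=0)
    # Scale the target's deviations to the source's spread, then shift to
    # the source's mean (guard against division by zero on flat channels).
    scaled = (tgt - tgt_mean) * (src_std / np.maximum(tgt_std, 1e-8))
    return (scaled + src_mean).reshape(target.shape)
```

The local variant in the paper would apply the same transformation only inside the color-influence mask rather than to every pixel.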
Experimental Investigation of Twin Cylinder Diesel Engine Using Diesel & Met... (IJMER)
In view of the increasing pressure on crude oil reserves, and the environmental degradation that results, fuels like methanol may offer a sustainable solution, as methanol can be produced from a wide range of carbon-based feedstocks. The present investigation evaluates methanol as a diesel engine fuel. The objective of this report is to analyze the fuel consumption and emission characteristics of a twin-cylinder diesel engine running on methanol, compared with the ordinary diesel available on the market. The report describes the setup and procedures of the experiment, which analyzes the emission characteristics and fuel consumption of the diesel engine with both fuels. Detailed studies of the experimental setup and its components were carried out before the experiment started; the data required for the analysis were recorded during the experiments, and calculations and analysis were performed once all the required data had been obtained. The engine was run at no load, i.e. with no load exerted on it, and a four-stroke twin-cylinder diesel engine was used to study brake thermal efficiency, brake specific energy consumption, and emissions at zero load and full load with methanol fuel; in this study the engine was tested using 100% methanol. By the end of the report, the project had successfully shown that a diesel engine can run on methanol, provided the engine is first started on diesel fuel, then switched to methanol, and finally switched back to diesel before being turned off. The performance of the engine on methanol was compared with its performance on diesel fuel, and the experimental results for the two fuels are also compared.
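The brake thermal efficiency the study measures follows from a standard definition: brake power divided by the rate of fuel energy input. The calorific values in the comments below are typical literature figures, not measurements from this report.

```python
def brake_thermal_efficiency(brake_power_kw, fuel_kg_per_h, cv_mj_per_kg):
    """Brake thermal efficiency = brake power / rate of fuel energy input.

    Typical lower calorific values: diesel ~42.5 MJ/kg, methanol ~19.7 MJ/kg
    (handbook figures, not this report's data). Methanol's much lower
    calorific value is why brake *specific energy* consumption, not fuel
    mass alone, is the fair basis for comparing the two fuels.
    """
    # kg/h -> kg/s, times MJ/kg -> MJ/s, times 1000 -> kW
    fuel_power_kw = fuel_kg_per_h / 3600.0 * cv_mj_per_kg * 1000.0
    return brake_power_kw / fuel_power_kw
```

For example, 10 kW of brake power from 3 kg/h of diesel corresponds to roughly 28% brake thermal efficiency.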
Abstract: In this paper, we define and study a new type of generalized closed set called the g∗s-closed set. Its relationship with previously defined generalized closed sets is also studied.
CFD Analysis and Fabrication of Aluminium Cenosphere Composites (IJMER)
Metal matrix composites are engineered materials combining two or more dissimilar materials to obtain enhanced properties. Aluminium alloys reinforced with ceramic particles exhibit superior mechanical properties compared with unreinforced aluminium alloys, and are therefore candidates for engineering applications. In the present investigation, an aluminium alloy is used as the matrix and cenosphere as the reinforcing material. The hybrid metal matrix composite is produced by conventional foundry techniques via the casting route. Cenosphere is added at 2%, 4%, and 6% by volume, and the influence of cenosphere particle size is also studied; magnesium is added to the molten metal as the main parameter governing the wettability between the cenosphere and the aluminium alloy. The hybrid composite is tested for hardness, density, mechanical properties, and impact strength. The density decreases with increasing cenosphere content, while the impact strength increases. The resistance to dry wear and slurry erosive wear also increases with cenosphere content, so the material can be used as a bearing material. Being less dense than aluminium, this composite can be used in place of conventional aluminium alloys in aircraft components.
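The reported density decrease with cenosphere content follows directly from the rule of mixtures; each phase contributes to the composite density in proportion to its volume fraction. The densities used in the test below (aluminium alloy ≈ 2.70 g/cm³, hollow cenosphere ≈ 0.80 g/cm³) are typical handbook values, not the paper's measurements.

```python
def composite_density(rho_matrix, rho_filler, volume_fraction):
    """Rule-of-mixtures density for a particulate composite.

    Densities in consistent units (e.g. g/cm^3); `volume_fraction` is the
    filler fraction in [0, 1]. A filler lighter than the matrix (such as
    hollow cenospheres in aluminium) necessarily lowers the composite
    density as its fraction increases.
    """
    return rho_matrix * (1.0 - volume_fraction) + rho_filler * volume_fraction
```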
Fuzzy Rule Based Model for Optimal Reservoir Releases (IJMER)
The aim of this paper is to develop a Fuzzy Rule Based (FRB) model for obtaining optimal reservoir releases. The area considered for the study is the Ukai reservoir project. The data considered are for the months of July, August, September, and October of the years 2007 and 2011. The inputs considered are inflow (MCM), storage (MCM), and demand (MCM), with release (MCM) as the output. Fuzzy logic analysis is based on the design of if-then rules, and a fuzzy logic model can handle various kinds of data through fuzzification, rule implication, and defuzzification. The steps involved in developing the model are the construction of membership functions, the creation of the fuzzy rules, implication, and defuzzification. The results obtained show that the releases from the FRB model satisfy the demand completely in all four months (July through October) of 2007, and the same is observed for 2011. Moreover, a significant amount of water is saved when the actual releases are compared with the releases obtained from the FRB model.
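The steps above (membership functions, rules, implication, defuzzification) can be sketched as a tiny Mamdani-style system. The membership ranges and the two-rule base below are illustrative placeholders, not the paper's calibrated Ukai-reservoir values, and only two of the three inputs are used to keep the sketch short.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership with support [a, c] and peak at b."""
    x = np.asarray(x, dtype=float)
    left = (x - a) / (b - a) if b > a else np.ones_like(x)
    right = (c - x) / (c - b) if c > b else np.ones_like(x)
    return np.clip(np.minimum(left, right), 0.0, 1.0)

def fuzzy_release(inflow, demand, universe=None):
    """Mamdani inference sketch for reservoir release (all values in MCM)."""
    if universe is None:
        universe = np.linspace(0.0, 1000.0, 201)
    # 1. Fuzzification of the crisp inputs
    in_low, in_high = trimf(inflow, 0, 0, 600), trimf(inflow, 400, 1000, 1000)
    dm_low, dm_high = trimf(demand, 0, 0, 600), trimf(demand, 400, 1000, 1000)
    # 2. Rule firing strengths (min = AND, max = OR)
    w_high = min(in_high, dm_high)  # IF inflow high AND demand high THEN release high
    w_low = max(in_low, dm_low)     # IF inflow low OR demand low THEN release low
    # 3. Implication: clip each output set at its rule strength, then aggregate
    rel_low = np.minimum(trimf(universe, 0, 0, 600), w_low)
    rel_high = np.minimum(trimf(universe, 400, 1000, 1000), w_high)
    agg = np.maximum(rel_low, rel_high)
    # 4. Defuzzification by the centroid method
    if agg.sum() == 0.0:
        return float(universe.mean())  # no rule fired; fall back to mid-range
    return float((universe * agg).sum() / agg.sum())
```

A calibrated model would use all three inputs and a fuller rule base tuned to the recorded 2007 and 2011 data.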
Combating Bit Losses in Computer Networks using Modified Luby Transform Code (IJMER)
International Journal of Modern Engineering Research (IJMER) is a peer-reviewed online journal. It serves as an international archival forum for scholarly research related to engineering and science education, covering fields such as electrical, mechanical, civil, chemical, computer, agricultural, and aerospace engineering, thermodynamics, structural and control engineering, robotics, mechatronics, fluid mechanics, nanotechnology, simulators, web-based learning, remote laboratories, engineering design methods, education research, students' satisfaction and motivation, global projects, and assessment, among many others.
Low Cost Self-assistive Voice Controlled Technology for Disabled People (IJMER)
Resolution of Human Arm Redundancy in Point Tasks by Synthesizing Two Criteria (IJMER)
The human arm is kinematically redundant in the task of pointing. As a result, multiple arm configurations can be used to complete a pointing task in which the tip of the index finger is brought to a preselected point in 3D space. The authors have developed a four degrees-of-freedom (DOF) model of the human arm synthesizing two redundancy resolution criteria, developed as an analytical tool for studying positioning tasks. The two criteria were: (1) minimizing the angular joint displacement (Minimal Angular Displacement, MAD) and (2) averaging the limits of the shoulder joint range (Joint Range Availability, JRA). As part of the experimental protocol, conducted with ten subjects, the kinematics of the human arm was acquired with a motion capture system in 3D space. The redundant joint angles predicted by an equally weighted model synthesizing the MAD and JRA criteria showed a linear correlation with the experimental data (slope = 0.88; offset = 1°; r² = 0.52). Given the experimental protocol, each individual criterion showed a weaker correlation with the experimental data (MAD: slope = 0.57, offset = 14°, r² = 0.36; JRA: slope = 0.84, offset = -1°, r² = 0.45). Solving the inverse kinematics problem of articulated redundant serial mechanisms such as a human or robotic arm has applications in human-robot interaction and wearable robotics, ergonomics, and computer graphics animation.
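The weighted synthesis of the two criteria can be sketched as a cost over candidate configurations. The candidate generation (an IK solver enumerating postures that all reach the target) is assumed and not shown, and the particular cost normalizations are illustrative assumptions rather than the authors' exact formulation.

```python
import numpy as np

def resolve_redundancy(candidates, prev_angles, joint_limits, w=0.5):
    """Choose among IK-equivalent arm configurations by a weighted sum of
    the MAD and JRA criteria (sketch).

    `candidates`: iterable of joint-angle vectors that all reach the target.
    `joint_limits`: (n, 2) array of [min, max] per joint.
    `w=0.5` corresponds to the equally weighted synthesis in the abstract.
    """
    prev = np.asarray(prev_angles, dtype=float)
    limits = np.asarray(joint_limits, dtype=float)
    mid = limits.mean(axis=1)                      # center of each joint's range
    half_span = (limits[:, 1] - limits[:, 0]) / 2.0
    best, best_cost = None, np.inf
    for q in candidates:
        q = np.asarray(q, dtype=float)
        mad = np.sum((q - prev) ** 2)              # angular displacement from previous posture
        jra = np.sum(((q - mid) / half_span) ** 2)  # distance from joint-range centers
        cost = w * mad + (1.0 - w) * jra
        if cost < best_cost:
            best, best_cost = q, cost
    return best
```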
Artificial neural networks are a fundamental means of modelling the information processing capabilities of nervous systems, and they play an important role in the field of cognitive science. This paper examines the features of artificial neural networks by reviewing existing research; these features were then assessed, evaluated, and comparatively analysed. Metrics such as the functional capabilities of neurons, learning capabilities, style of computation, processing elements, processing speed, connections, connection strength, information storage, information transmission, communication media selection, signal transduction, and fault tolerance were used as the basis for comparison. A major finding of this paper is that artificial neural networks serve as the platform for neuro-computing technology in the field of cognitive science.
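Several of the surveyed features (processing elements, connection strengths, and learning capability) can be illustrated with the simplest possible network, a single perceptron. This is a generic textbook sketch, not a model taken from the paper under review.

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    """Train one processing element with adjustable connection strengths.

    The weights (connection strengths) are updated by the error-driven
    perceptron rule, illustrating the 'learning capability' feature the
    survey compares across network types.
    """
    w = np.zeros(X.shape[1] + 1)                 # weights plus a bias term
    Xb = np.hstack([X, np.ones((len(X), 1))])    # append constant bias input
    for _ in range(epochs):
        for xi, target in zip(Xb, y):
            pred = 1 if xi @ w > 0 else 0        # threshold activation
            w += lr * (target - pred) * xi       # adjust strengths on error
    return w

def predict(w, X):
    """Apply the trained processing element to new inputs."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(int)
```

On a linearly separable problem such as logical AND, this rule provably converges in a finite number of updates.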
What is it about the human brain that makes us smarter than other animals? (RazaAliKhan10)
The human brain is one of the most vital organs in our bodies; it is because of this brain that humans grew smarter than other animals. It is therefore important to understand the answer to the question: what is it about the human brain that makes us smarter than other animals? The human heart, likewise, is one of the most vital organs in the body, as it is responsible for keeping us alive. It is a four-chambered muscular organ, approximately the size of a clenched hand, and one of the most powerful and hardest-working muscles in the body, functioning throughout a person's life.
However, this is only a small part of a more complex picture. In a study published in Nature Neuroscience, using evidence from different species and multiple neuroscientific disciplines, we show that there isn’t just one type of information processing in the brain. How information is processed also differs between humans and other primates, which may explain why our species’ cognitive abilities are so superior.
We borrowed concepts from what’s known as the mathematical framework of information theory – the study of measuring, storing and communicating digital information which is crucial to technology such as the internet and artificial intelligence – to track how the brain processes information. We found that different brain regions in fact use different strategies to interact with each other.
Some brain regions exchange information with others in a very stereotypical way, using input and output. This ensures that signals get across in a reproducible and dependable manner. This is the case for areas that are specialised for sensory and motor functions (such as processing sound, visual and movement information).
Take the eyes, for example, which send signals to the back of the brain for processing. The majority of the information that is sent is duplicated, being provided by each eye. Half of this information, in other words, is not needed. So we call this type of input-output information processing "redundant".
But the redundancy provides robustness and reliability – it is what enables us to still see with only one eye. This capability is essential for survival. In fact, it is so crucial that the connections between these brain regions are anatomically hard-wired in the brain, a bit like a telephone landline.
However, not all information provided by the eyes is redundant. Combining informa
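The "redundancy" the article describes is quantified with information theory. As a minimal illustration (a plug-in estimator, far simpler than the partial-information-decomposition machinery such studies actually use), mutual information measures how much two signals share:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(X;Y) in bits from a list of (x, y) samples.

    If Y merely duplicates X (like the overlapping portion of the two
    eyes' signals), I(X;Y) equals the full entropy of X: the second
    signal is completely redundant. If X and Y are independent, I(X;Y)
    is zero: nothing is shared.
    """
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) * p(y)) ), with the 1/n factors folded in
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi
```

Synergistic processing, by contrast, is information available only from the signals taken together, which this pairwise measure cannot by itself separate out.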
A Study on Translucent Concrete Product and Its Properties by Using Optical F... (IJMER)
Translucent concrete is a concrete-based material with light-transmitting properties, obtained by embedding optical elements, such as optical fibres, in the concrete. Light is conducted through the concrete from one end to the other, producing a light pattern on the far surface that depends on the fibre structure. Optical fibres transmit light so effectively that there is virtually no loss of the light conducted through them. This paper deals with the modelling of such translucent or transparent concrete blocks and panels, their usage, and the advantages they bring to the field. The main purpose is to use sunlight as a light source to reduce the power consumed for illumination, to use the optical fibres to sense the stress of structures, and to use this concrete for architectural purposes in buildings.
Developing Cost Effective Automation for Cotton Seed Delinting (IJMER)
A low-cost automation system for removing lint from cottonseed was designed and developed. The setup consists of a stainless steel drum with a stirrer, in which linted cottonseed is mixed with concentrated sulphuric acid so that the lint is burnt off. The lint-free cottonseed is then treated with lime water to neutralize the acid, and after washing with water the cottonseed can be used for agricultural purposes.
Study & Testing of Bio-Composite Material Based on Munja Fibre (IJMER)
Natural fibre composites, such as munja fibre composites, have found increasing application in many areas of engineering and technology, mainly because of their practical benefits: they are lightweight and low cost compared with synthetic fibre composites. The aim of this study is to evaluate mechanical properties, such as the flexural and tensile properties, of reinforced epoxy composites. Munja fibres have recently become a substitute material in many weight-critical applications in areas such as aerospace, automotive, and other demanding industrial sectors. In this study, natural munja fibre composites and munja/fibreglass hybrid composites were fabricated by a combination of hand lay-up and cold-press methods. The present work considers a new variety of munja fibre; the main aim is to extract the neat fibre and characterize its flexural behaviour. The composites are fabricated by reinforcing untreated and treated fibre and are tested for their mechanical properties strictly according to ASTM procedures.
Hybrid Engine (Stirling Engine + IC Engine + Electric Motor) (IJMER)
A hybrid engine is a combination of a Stirling engine, an IC engine, and an electric motor, with all three connected to a single shaft. The power source of the Stirling engine will be a solar panel. The aim is to run an automobile using this hybrid engine.
Fabrication & Characterization of Bio-Composite Materials Based on Sunnhemp F... (IJMER)
Present-day technology demands eco-friendly developments. In this era, composite materials play a vital role in different fields of engineering and are used as principal materials; nowadays they are an important component of the engineering field. Whereas the importance of composite applications is well known, the use of natural fibres for reinforcement has been given priority for some time. However, changing from synthetic fibres to natural fibres provides only half-green composites; a fully green composite is achieved only if the matrix component is also eco-friendly. Keeping this in view, a detailed literature survey has been carried out through various issues of the journals related to this field. The material system used is sunnhemp fibre, with some epoxy and hardener added for stability and drying of the bio-composites. Various graphs and bar charts are superimposed on each other for comparison, with the plots produced in MATLAB and ORIGIN 6.0. To determine tensile strengths, various properties of the different bio-composites have been compared among themselves, and the behaviour of the bio-composites in this work has also been compared with that reported in other works. The bio-composites developed in this work are likely to find applications in false ceilings, partitions, biodegradable packaging, automotive interiors, sports goods (e.g. rackets, nets), toys, etc.
Geochemistry and Genesis of Kammatturu Iron Ores of Devagiri Formation, Sandu...IJMER
The Greenstone belts of Karnataka are enriched in BIFs in Dharwar craton, where Iron
formations are confined to the basin shelf, clearly separated from the deeper-water iron formation that
accumulated at the basin margin and flanking the marine basin. Geochemical data procured in terms of
major, trace and REE are plotted in various diagrams to interpret the genesis of BIFs. Al2O3, Fe2O3 (T),
TiO2, CaO, and SiO2 abundances and ratios show a wide variation. Ni, Co, Zr, Sc, V, Rb, Sr, U, Th,
ΣREE, La, Ce and Eu anomalies and their binary relationships indicate that wherever the terrigenous
component has increased, the concentration of elements of felsic such as Zr and Hf has gone up. Elevated
concentrations of Ni, Co and Sc are contributed by chlorite and other components characteristic of basic
volcanic debris. The data suggest that these formations were generated by chemical and clastic
sedimentary processes on a shallow shelf. During transgression, chemical precipitation took place at the
sediment-water interface, whereas at the time of regression. Iron ore formed with sedimentary structures
and textures in Kammatturu area, in a setting where the water column was oxygenated.
Experimental Investigation on Characteristic Study of the Carbon Steel C45 in...IJMER
In this paper, the mechanical characteristics of C45 medium carbon steel are investigated
under various working conditions. The main characteristic to be studied on this paper is impact toughness
of the material with different configurations and the experiment were carried out on charpy impact testing
equipment. This study reveals the ability of the material to absorb energy up to failure for various
specimen configurations under different heat treated conditions and the corresponding results were
compared with the analysis outcome
Non linear analysis of Robot Gun Support Structure using Equivalent Dynamic A...IJMER
Robot guns are being increasingly employed in automotive manufacturing to replace
risky jobs and also to increase productivity. Using a single robot for a single operation proves to be
expensive. Hence for cost optimization, multiple guns are mounted on a single robot and multiple
operations are performed. Robot Gun structure is an efficient way in which multiple welds can be done
simultaneously. However mounting several weld guns on a single structure induces a variety of
dynamic loads, especially during movement of the robot arm as it maneuvers to reach the weld
locations. The primary idea employed in this paper, is to model those dynamic loads as equivalent G
force loads in FEA. This approach will be on the conservative side, and will be saving time and
subsequently cost efficient. The approach of the paper is towards creating a standard operating
procedure when it comes to analysis of such structures, with emphasis on deploying various technical
aspects of FEA such as Non Linear Geometry, Multipoint Constraint Contact Algorithm, Multizone
meshing .
Static Analysis of Go-Kart Chassis by Analytical and Solid Works SimulationIJMER
This paper aims to do modelling, simulation and performing the static analysis of a go
kart chassis consisting of Circular beams. Modelling, simulations and analysis are performed using 3-D
modelling software i.e. Solid Works and ANSYS according to the rulebook provided by Indian Society of
New Era Engineers (ISNEE) for National Go Kart Championship (NGKC-14).The maximum deflection is
determined by performing static analysis. Computed results are then compared to analytical calculation,
where it is found that the location of maximum deflection agrees well with theoretical approximation but
varies on magnitude aspect.
In récent year various vehicle introduced in market but due to limitation in
carbon émission and BS Séries limitd speed availability vehicle in the market and causing of
environnent pollution over few year There is need to decrease dependancy on fuel vehicle.
bicycle is to be modified for optional in the future To implement new technique using change in
pedal assembly and variable speed gearbox such as planetary gear optimise speed of vehicle
with variable speed ratio.To increase the efficiency of bicycle for confortable drive and to
reduce torque appli éd on bicycle. we introduced epicyclic gear box in which transmission done
throgh Chain Drive (i.e. Sprocket )to rear wheel with help of Epicyclical gear Box to give
number of différent Speed during driving.To reduce torque requirent in the cycle with change in
the pedal mechanism
Integration of Struts & Spring & Hibernate for Enterprise ApplicationsIJMER
The proposal of this paper is to present Spring Framework which is widely used in
developing enterprise applications. Considering the current state where applications are developed using
the EJB model, Spring Framework assert that ordinary java beans(POJO) can be utilize with minimal
modifications. This modular framework can be used to develop the application faster and can reduce
complexity. This paper will highlight the design overview of Spring Framework along with its features that
have made the framework useful. The integration of multiple frameworks for an E-commerce system has
also been addressed in this paper. This paper also proposes structure for a website based on integration of
Spring, Hibernate and Struts Framework.
Microcontroller Based Automatic Sprinkler Irrigation SystemIJMER
Microcontroller based Automatic Sprinkler System is a new concept of using
intelligence power of embedded technology in the sprinkler irrigation work. Designed system replaces
the conventional manual work involved in sprinkler irrigation to automatic process. Using this system a
farmer is protected against adverse inhuman weather conditions, tedious work of changing over of
sprinkler water pipe lines & risk of accident due to high pressure in the water pipe line. Overall
sprinkler irrigation work is transformed in to a comfortableautomatic work. This system provides
flexibility & accuracy in respect of time set for the operation of a sprinkler water pipe lines. In present
work the author has designed and developed an automatic sprinkler irrigation system which is
controlled and monitored by a microcontroller interfaced with solenoid valves.
On some locally closed sets and spaces in Ideal Topological SpacesIJMER
In this paper we introduce and characterize some new generalized locally closed sets
known as
δ
ˆ
s-locally closed sets and spaces are known as
δ
ˆ
s-normal space and
δ
ˆ
s-connected space and
discussed some of their properties
Intrusion Detection and Forensics based on decision tree and Association rule...IJMER
This paper present an approach based on the combination of, two techniques using
decision tree and Association rule mining for Probe attack detection. This approach proves to be
better than the traditional approach of generating rules for fuzzy expert system by clustering methods.
Association rule mining for selecting the best attributes together and decision tree for identifying the
best parameters together to create the rules for fuzzy expert system. After that rules for fuzzy expert
system are generated using association rule mining and decision trees. Decision trees is generated for
dataset and to find the basic parameters for creating the membership functions of fuzzy inference
system. Membership functions are generated for the probe attack. Based on these rules we have
created the fuzzy inference system that is used as an input to neuro-fuzzy system. Fuzzy inference
system is loaded to neuro-fuzzy toolbox as an input and the final ANFIS structure is generated for
outcome of neuro-fuzzy approach. The experiments and evaluations of the proposed method were
done with NSL-KDD intrusion detection dataset. As the experimental results, the proposed approach
based on the combination of, two techniques using decision tree and Association rule mining
efficiently detected probe attacks. Experimental results shows better results for detecting intrusions as
compared to others existing methods
Natural Language Ambiguity and its Effect on Machine LearningIJMER
"Natural language processing" here refers to the use and ability of systems to process
sentences in a natural language such as English, rather than in a specialized artificial computer
language such as C++. The systems of real interest here are digital computers of the type we think of as
personal computers and mainframes. Of course humans can process natural languages, but for us the
question is whether digital computers can or ever will process natural languages. We have tried to
explore in depth and break down the types of ambiguities persistent throughout the natural languages
and provide an answer to the question “How it affects the machine translation process and thereby
machine learning as whole?” .
Today in era of software industry there is no perfect software framework available for
analysis and software development. Currently there are enormous number of software development
process exists which can be implemented to stabilize the process of developing a software system. But no
perfect system is recognized till yet which can help software developers for opting of best software
development process. This paper present the framework of skillful system combined with Likert scale. With
the help of Likert scale we define a rule based model and delegate some mass score to every process and
develop one tool name as MuxSet which will help the software developers to select an appropriate
development process that may enhance the probability of system success.
Material Parameter and Effect of Thermal Load on Functionally Graded CylindersIJMER
The present study investigates the creep in a thick-walled composite cylinders made
up of aluminum/aluminum alloy matrix and reinforced with silicon carbide particles. The distribution
of SiCp is assumed to be either uniform or decreasing linearly from the inner to the outer radius of
the cylinder. The creep behavior of the cylinder has been described by threshold stress based creep
law with a stress exponent of 5. The composite cylinders are subjected to internal pressure which is
applied gradually and steady state condition of stress is assumed. The creep parameters required to
be used in creep law, are extracted by conducting regression analysis on the available experimental
results. The mathematical models have been developed to describe steady state creep in the composite
cylinder by using von-Mises criterion. Regression analysis is used to obtain the creep parameters
required in the study. The basic equilibrium equation of the cylinder and other constitutive equations
have been solved to obtain creep stresses in the cylinder. The effect of varying particle size, particle
content and temperature on the stresses in the composite cylinder has been analyzed. The study
revealed that the stress distributions in the cylinder do not vary significantly for various combinations
of particle size, particle content and operating temperature except for slight variation observed for
varying particle content. Functionally Graded Materials (FGMs) emerged and led to the development
of superior heat resistant materials.
Energy Audit is the systematic process for finding out the energy conservation
opportunities in industrial processes. The project carried out studies on various energy conservation
measures application in areas like lighting, motors, compressors, transformer, ventilation system etc.
In this investigation, studied the technical aspects of the various measures along with its cost benefit
analysis.
Investigation found that major areas of energy conservation are-
1. Energy efficient lighting schemes.
2. Use of electronic ballast instead of copper ballast.
3. Use of wind ventilators for ventilation.
4. Use of VFD for compressor.
5. Transparent roofing sheets to reduce energy consumption.
So Energy Audit is the only perfect & analyzed way of meeting the Industrial Energy Conservation.
An Implementation of I2C Slave Interface using Verilog HDLIJMER
The focus of this paper is on implementation of Inter Integrated Circuit (I2C) protocol
following slave module for no data loss. In this paper, the principle and the operation of I2C bus protocol
will be introduced. It follows the I2C specification to provide device addressing, read/write operation and
an acknowledgement. The programmable nature of device provide users with the flexibility of configuring
the I2C slave device to any legal slave address to avoid the slave address collision on an I2C bus with
multiple slave devices. This paper demonstrates how I2C Master controller transmits and receives data to
and from the Slave with proper synchronization.
The module is designed in Verilog and simulated in ModelSim. The design is also synthesized in Xilinx
XST 14.1. This module acts as a slave for the microprocessor which can be customized for no data loss.
Discrete Model of Two Predators competing for One PreyIJMER
This paper investigates the dynamical behavior of a discrete model of one prey two
predator systems. The equilibrium points and their stability are analyzed. Time series plots are obtained
for different sets of parameter values. Also bifurcation diagrams are plotted to show dynamical behavior
of the system in selected range of growth parameter
Conversion of Artificial Neural Networks (ANN) To Autonomous Neural Networks
International Journal of Modern Engineering Research (IJMER)
www.ijmer.com Vol.3, Issue.3, May-June. 2013 pp-1304-1306 ISSN: 2249-6645
www.ijmer.com 1304 | Page
H Bhargav,¹ Dr. Nataraj K. R.²
¹Assistant Professor, Department of Electrical and Electronics Engineering, Vidyavardhaka College of Engineering, Mysore, Karnataka, India.
²Professor, Department of Electronics and Communications Engineering, SJB Institute of Technology, Bangalore, Karnataka, India.
Abstract: This article points out some serious drawbacks of Artificial Neural Networks when compared to the human brain. It argues that Artificial Neural Networks need to be re-implemented with a change in their underlying concepts. Doing so requires a clear picture of the brain's learning mechanism, one free of common misconceptions. This article therefore attempts to identify the aspects that must be considered in order to make neural networks 'autonomous bodies', just like the human brain.
Keywords: ANN, autonomous neural networks, brain-like learning, subsystem control theory.
I. INTRODUCTION
The greatest drawback of current artificial neural network models is that they are human-intervention systems: their learning algorithms need constant attention. Such algorithms cannot be used in future robots or any other systems that are supposed to be autonomous, because it is impossible to build autonomous systems without autonomous learning algorithms.
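The human-intervention problem can be made concrete with a small sketch (my own illustration, not code from this paper): a classic perceptron trained on the AND function. The learning rate and epoch count here are hypothetical values that a person must choose and tune by hand; the algorithm itself cannot pick them.

```python
def train_perceptron(samples, lr=0.1, epochs=20):
    """Classic perceptron rule; `lr` and `epochs` are human-chosen."""
    w = [0.0, 0.0]          # connection strengths
    b = 0.0                 # bias
    for _ in range(epochs):
        for x, target in samples:
            out = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - out
            w[0] += lr * err * x[0]   # update depends on the external lr
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND, lr=0.1, epochs=20)
predict = lambda x: 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
print([predict(x) for x, _ in AND])  # learns AND only if lr/epochs were chosen well
```

If the human supplies an unsuitable learning rate or too few epochs, the same code fails to learn, which is exactly the kind of outside supervision the article criticizes.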
This article points out the important differences between Artificial Neural Networks and the human brain, and argues for the immediate need to improve current neural networks so that they can be rendered "self-learning" and "completely autonomous."
Artificial neural networks should be developed in such a way that a user of an ANN feels as if a human being has been employed to address the given real-time problem. An ANN can be brought closer to the human brain by improving its standard of learning. ANNs are, after all, derivatives of the human brain; if they are developed so that they can take decisions independently under all conditions, without any human intervention, then they can closely approximate human brains.
II. Brain Learning Mechanism
The brain is a wonderful creation of nature. While animals use it only for basic needs, the human brain can perform remarkable tasks, and it has inspired many scientists and researchers to construct artificial neural networks.
The brain is a vast network comprising billions of neurons and connections to various organs. It acts like a central processing unit that governs the body's functions effectively. The human brain is almost an autonomous system: it does not require outside processes to control its learning, and it recalls previous experiences and its own interpretations when taking a decision on a particular issue.
Sometimes the brain is also influenced by its modes of action, and decisions are made according to one of those modes. Another interesting feature of the brain is the good coordination it achieves between the functioning of various organs.
Two of the main functions of the brain are memory and learning. There are of course many categories of memory (short-term, medium-term, long-term, working memory, episodic memory and so on) and of learning (supervised, unsupervised, inductive, reinforcement and so on). To characterize the learning behavior of the brain, it is necessary to distinguish between these two functions. Learning generally implies learning rules from examples. Memory, on the other hand, implies the simple storing of facts and information for later recall (e.g. an image, a scene, a song, an instruction). Memory is often confused with learning, but the processes of memorization differ from those of learning: memory and learning are not the same.
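The distinction between memory and learning can be sketched in a few lines of code (an illustration of mine, not from the paper): memory stores facts for exact recall, while learning extracts a rule that also answers queries it has never seen.

```python
# Memory: store examples verbatim; recall works only for stored items.
pairs = [(1, 2), (2, 4), (3, 6)]
memory = dict(pairs)

# Learning: fit a rule (here y = w*x by least squares) from the same
# examples; the rule generalizes beyond what was stored.
w = sum(x * y for x, y in pairs) / sum(x * x for x, y in pairs)

print(memory.get(2))    # recall of a stored fact
print(memory.get(5))    # never stored: memory has no answer
print(w * 5)            # the learned rule still gives an answer
```

The memory fails on the unseen input 5, while the learned rule y = 2x extrapolates to it, which is the generalization the text attributes to learning.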
III. Misconceptions About the Human Brain
There are several misconceptions about the human brain. Many researchers say that the human brain is inferior to Artificial Neural Networks, yet Artificial Neural Networks are themselves derivatives of the human brain. The human brain has unlimited potential with which it can explore the finest aspects of any concept and arrive at a proper conclusion. If the human brain is properly understood, Artificial Neural Networks may be designed so that their degree of resemblance to the human brain increases. Despite numerous advancements in the field, ANNs still cannot be regarded as "duplicates" of the human brain. It is therefore of utmost importance to improve the features of ANNs so that they develop a brain-like capacity to address real-time problems.
Some of the misconceptions about the human brain are:
"A human's knowledge is volatile and may not become permanent. Several factors cause brain cells to die, and when they do, the information stored in that part is lost and we start to forget." - This is not really true, because the brain has a distributed memory system and memory loss is a very rare case. If the brain is used effectively, knowledge is never lost.
"The brain is always provided with the learning parameters to address a problem." - To seek a solution to a given problem, or to generalize well, the brain must itself decide on network parameters such as the number of layers, the number of neurons per layer, connection strengths and so on. The learning parameters and the networks themselves do not come "readymade"; since the natures of problems differ, the brain has to decide on the different network designs and network parameters internally.
"The brain stores no information prior to learning and learns instantaneously." - One might think that a system with no memory requirement is very efficient, but such a system consumes more time in processing. In this respect the human brain is superior to ANNs precisely because it has a memory: it never learns instantaneously, but on the basis of information collected before learning. This misconception violates basic behavioral facts, for remembering related facts and examples is part of human learning.
"The human brain's processing speed is low compared to that of Artificial Neural Networks." - In fact the human brain can imagine things at enormous speed, whereas Artificial Neural Networks must first be trained before their speed can even be measured. Speed also refers to a system's capacity for taking proper decisions: a system that sets priorities among its tasks, or finds effective ways to solve them before processing, has effectively sped up the processing, whereas a system that simply processes a task without weighing its pros and cons is wasting precious time.
“Each neuron in the brain is an autonomous body”: the notion that each neuron adjusts its weights solely on the basis of its own inputs and outputs is not supported by neurobiological evidence; external agents can also influence synaptic adjustments. The autonomy claim implies that no entity external to a cell or neuron is allowed to change its connection strengths. Yet if the backpropagation learning algorithm is considered, each cell uses information about the input, the output, the network’s error on the task, and the individual cell’s contribution to that error, none of which is locally available. The claim is therefore logically inconsistent.
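The non-local character of backpropagation is visible directly in the update equations. Below is a minimal sketch of one gradient step for a 1-2-1 sigmoid network; note that the hidden-weight update uses `delta_out`, an error signal computed at the output layer, which a truly autonomous hidden cell could not know.

```python
import math

def backprop_step(x, target, w_hidden, w_out, lr=0.5):
    """One backpropagation step for a 1-2-1 sigmoid network. The hidden
    weights are updated using delta_out, an error signal computed at the
    *output* layer: information external to the hidden cells, not
    derivable from their own inputs and outputs alone."""
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    h = [sig(w * x) for w in w_hidden]                  # hidden activations
    y = sig(sum(wo * hi for wo, hi in zip(w_out, h)))   # network output
    delta_out = (y - target) * y * (1 - y)              # network-level error
    new_w_out = [wo - lr * delta_out * hi for wo, hi in zip(w_out, h)]
    new_w_hidden = [
        w - lr * (delta_out * wo) * hi * (1 - hi) * x   # non-local term
        for w, wo, hi in zip(w_hidden, w_out, h)
    ]
    return new_w_hidden, new_w_out, y

# Repeated steps drive the output toward the target
wh, wo = [0.5, -0.3], [0.8, 0.2]
_, _, y0 = backprop_step(1.0, 0.0, wh, wo)
for _ in range(200):
    wh, wo, y = backprop_step(1.0, 0.0, wh, wo)
```

Every weight change in `new_w_hidden` is proportional to `delta_out * wo`: a product of quantities that live in a different layer, which is exactly the inconsistency the paragraph points out.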
IV. How to Turn ANNs into Autonomous Neural Networks
The field of Artificial Neural Networks has developed several learning algorithms over the years, but they work well only with human intervention. To make ANNs work properly, their learning rates must be reset and readjusted, and different network designs must be tried before they generalize well. When catastrophic forgetting occurs in a network, everything must be relearned from scratch. There is a long list of such drawbacks that need to be taken seriously. One of the founders of the field, a past president of the International Neural Network Society (INNS), confided that “the neuro-boom is over.” Yet many other scholars have kept on resisting the arguments against the current science of brain-like learning.
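Catastrophic forgetting is easy to demonstrate even in a one-parameter model trained by plain gradient descent on two tasks in sequence; the tasks and learning rate here are illustrative choices, not an example from the text.

```python
def fit(w, data, lr=0.1, steps=100):
    """Plain SGD on squared error for the model y = w * x.
    Nothing about previously learned tasks is retained."""
    for _ in range(steps):
        for x, y in data:
            w -= lr * (w * x - y) * x
    return w

task_a = [(1.0, 2.0), (2.0, 4.0)]    # underlying rule: y = 2x
task_b = [(1.0, -1.0), (2.0, -2.0)]  # underlying rule: y = -x

w = fit(0.0, task_a)                 # w converges to about 2
err_a_before = abs(w * 1.0 - 2.0)    # near 0: task A is learned
w = fit(w, task_b)                   # training on B overwrites A
err_a_after = abs(w * 1.0 - 2.0)     # about 3: task A is forgotten
```

Because the same weight must serve both tasks and no memory of task A is kept, learning task B destroys the earlier solution, which is why "everything must be relearned from scratch".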
Minsky and Papert not only showed the limitations of perceptrons, the simplest neural networks, but also raised the deeper question of the computational complexity of learning algorithms. Despite the deep and disturbing questions raised by thoughtful critics, the neural network field has pressed heedlessly ahead with its research agenda. Now, faced with fundamental challenges to the assumptions behind their brain-like learning algorithms, prominent researchers in the field are finally calling for a “shake-up of the field of neural networks” and for its “rebirth.”
Artificial Neural Networks can become autonomous bodies if they are endowed with the capabilities listed below:
- An ANN should be equipped with memory, so that it operates at greater speed.
- An ANN should be capable of making decisions about task selection and processing: it should be able to set priorities among tasks and to decide the best possible way of processing them, instead of merely operating on a given task.
- An ANN should be able to aim at and fix a target for processing tasks, without which processing would take more time.
- An ANN should not be problem-specific but should be able to address any problem.
- An ANN should be capable of adjusting both the weights of its synaptic connections and the network structure itself.
- An ANN should be able to sense the “situations” in the surrounding environment and address the given problem without the aid of an external teacher. Taking the situation (requirements, rules, etc.) into consideration, the ANN should act so as to obtain the desired output.
- An ANN should have the flexibility to switch between different modes of action; for instance, if we desire the ANN to work in the mode of passion, it should permit the same.
- An ANN should have subsystems within itself that can control other subsystems, so that a source external to a neuron can control its behavior. This is quite different from the “local learning” concept of current ANN technology [10].
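The last point, a controlling subsystem external to the network, can be sketched as a separate agent that inspects the problem first and only then fixes the network design and the learning parameters: decisions that current ANN practice leaves to a human experimenter. The sizing heuristics below are purely illustrative assumptions, not a published method.

```python
class ControllerSubsystem:
    """Sketch of a 'master' or controlling subsystem: it examines the
    collected data and then designs the network (here, just its width)
    and sets its learning rate, rather than requiring a human to do so.
    Both rules of thumb are illustrative assumptions."""

    def design(self, samples, labels):
        n_classes = len(set(labels))
        n_features = len(samples[0])
        hidden = max(2, 2 * n_classes)   # width chosen from the data
        lr = 0.5 / n_features            # step size chosen from input scale
        return {"hidden_units": hidden, "learning_rate": lr}

controller = ControllerSubsystem()
spec = controller.design([(0.0, 1.0), (1.0, 0.0)], ["A", "B"])
# spec == {"hidden_units": 4, "learning_rate": 0.25}
```

The design decisions are made by a subsystem of the overall system, not by its individual "neurons", which is precisely the departure from local learning that the list advocates.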
Artificial Neural Networks can achieve “brain-like learning” if they are equipped with all of these abilities. To become autonomous bodies, they should combine a variety of activation functions and network topologies.
V. Conclusions
The greatest drawback of the existing theories of artificial neural networks is their characterization of an autonomous learning system such as the brain. Despite clear definitions of the internal mechanisms of the brain [12], no one has characterized in a similar manner the external behavioral characteristics those mechanisms are supposed to produce.
Consequently, ANN algorithm development has proceeded with local, autonomous learning, memoryless learning, and instantaneous learning in view, rather than from the point of view of the “external behavioral characteristics” of human learning. If a conjecture about the internal mechanisms cannot reproduce that set of external characteristics, then the conjecture is not a valid one.
The present article has examined some current notions of human learning and shown their logical inconsistencies. There is therefore a definite need for new ideas about the internal mechanisms of the brain.
It would be better if current ANN systems acknowledged the ideas listed in the previous section, the most important of which is the last: the concept of a “master” or controlling subsystem that designs networks and sets their learning parameters. Very recently such non-local means of learning have been used effectively to develop powerful learning algorithms that can design and train networks in polynomial time [2, 9, 10]. In addition, this “subsystem control” framework resolves many of the problems and dilemmas of current ANNs. Under such a framework, learning need not be instantaneous but can wait until some information about the problem has been collected; a controlling subsystem can always invoke learning at a later point in time. This also makes it possible to gauge the complexity of a problem, from the information already collected and stored, before the problem is actually tackled. Such a framework would likewise resolve the network-design dilemma and the problems of algorithmic efficiency that have negatively influenced this field for so long [2, 9, 10]. One can therefore argue strongly for theories such as “subsystem control” that relate to the human brain, and for the use of such concepts in designing ANNs. If ANNs are designed with due consideration of the actual behavior of the human brain, they will undoubtedly become Autonomous Neural Networks.
References
[1] Churchland, P. and Sejnowski, T. The Computational Brain. MIT Press, Cambridge, MA, 1992.
[2] Glover, F. Improved Linear Programming Models for Discriminant Analysis. Decision Sciences, 21 (1990), 4, 771-785.
[3] Grossberg, S. Nonlinear neural networks: principles, mechanisms, and architectures. Neural Networks, 1 (1988), 17-61.
[4] Hebb, D. O. The Organization of Behavior, a Neuropsychological Theory. New York: John Wiley, 1949.
[5] Levine, D. S. Introduction to Neural and Cognitive Modeling, Hillsdale, NJ: Lawrence Erlbaum, 1998.
[6] Minsky, M. and Papert, S. Perceptrons. The MIT Press, Cambridge, MA, 1988.
[7] Moody, J. & Darken, C. Fast Learning in Networks of Locally-Tuned Processing Units, Neural Computation. 1 (1989), 2, 281-294.
[8] Reilly, D.L., Cooper, L.N. and Elbaum, C. A Neural Model for Category Learning. Biological Cybernetics, 45 (1982), 35-41.
[9] Roy, A., Govil, S. & Miranda, R. An Algorithm to Generate Radial Basis Function (RBF)-like Nets for Classification Problems.
Neural Networks, 8 (1995), 2, 179-202.
[10] Roy, A. Summary of panel discussion at ICNN’97 on connectionist learning. "Connectionist Learning: Is it Time to Reconsider the
Foundations." INNS/ENNS/JNNS Newsletter, appearing with Neural Networks, 11 (1998), 2.
[11] Rumelhart, D.E. and McClelland, J.L. (eds.) Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1:
Foundations. MIT Press, Cambridge, MA, 1986, 318-362.
[12] Rumelhart, D.E. The Architecture of Mind: A Connectionist Approach. Chapter 8 in Haugeland, J. (ed), Mind Design II, 1997, MIT
Press, 205-232.