Cognitive Computing - Things Are Going to Get Scary
October 31, 2013
By Eric Bell, Contributing Columnist and Editorial Staff of MangoBoss

Remember when you used your cell phone simply to make phone calls, and then one day Apple introduced something called an iPhone and everything changed? The iPhone wasn't just a new generation of cell phone. It reconceived the cell phone as something more than a phone service: a broader platform on which whole new categories of applications, having nothing to do with making a simple call, could run. In science, that kind of shift in thinking introduces something so new that it establishes a new paradigm.

Well, a similar seismic change is being talked about and experimented with in the field of computing itself. Computing was once thought of as hard-wired into a machine. The machine could be built with tinier and faster parts so that it cranked out computations faster and faster, but it was still just computing.

Now, scientists are conceiving of computing differently. They are asking a different question: can a computing model be built that frees the computer not just to one day imitate or match human thinking speed, but to break out of the confines of mechanical thinking altogether? In this new paradigm, a computer would not just learn; it would learn to learn better and better. It would break free of its originator's concept of what it is capable of and, like humans, explore new boundaries and frontiers on its own. It would have its own brain.

IBM has recently decided to devote some of its resources to research in the cognitive computing field. The finer details of cognitive computing are quite complex, but the potential results of this work are huge. Cognitive computing essentially aims to create a better form of artificial intelligence (AI). The AI we have now relies on information being hard-coded into machines. If a problem cannot be solved with the information hard-coded into a particular machine's AI, then that machine will not solve it.
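To make that limitation concrete, here is a toy sketch (purely illustrative, not IBM's system or any real product): a "hard-coded" program can only answer the questions its programmer anticipated, and simply fails on anything outside its table.

```python
# A hard-coded "AI": every answer its programmer thought to include, and nothing more.
RULES = {
    "2 + 2": "4",
    "capital of France": "Paris",
}

def hard_coded_answer(question):
    # If the question is not already in the table, the machine cannot solve it.
    return RULES.get(question, "I don't know")

print(hard_coded_answer("capital of France"))  # Paris
print(hard_coded_answer("capital of Spain"))   # I don't know
```

No amount of extra speed helps here; the program's competence is frozen at whatever its builder wrote down.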

When Machines Start to Learn

Cognitive computing aims to have the AI learn, grow, and ultimately become smarter as time goes on. It would allow a machine's AI to solve problems based on the information it is given, its environment, and so on.

As the machine spends more time solving problems and encounters more information, it grows and becomes smarter, much like a human.
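The contrast with the hard-coded table can be sketched with one of the oldest learning algorithms, the perceptron. This toy example (an illustration of machine learning in general, not of IBM's research) starts knowing nothing and works out the OR function purely from labelled examples, getting better as it sees more of them.

```python
# Toy learner: a perceptron that improves from experience instead of
# relying on rules written in advance.
def train(examples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # weights start at zero: the machine knows nothing
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred          # compare guess with reality...
            w[0] += lr * err * x1       # ...and nudge the weights toward it
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Labelled experience: the OR function, never stated as a rule.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train(data)
print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # [0, 1, 1, 1]
```

The behaviour comes from the data, not from the programmer, which is the essential shift the article is describing, if on a vastly smaller scale.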

Scary stuff, true, but an inevitable part of your future.

The dream of those working in AI is not to create a computer as smart as a human. It is to create something better than humans, able to think not just faster but better. Such a machine would go beyond duplication to innovation.

Researchers are still years away from having AI fully ramped up with cognitive computing, which is probably fine with people worried about someone creating a real-life Skynet. Cognitive computing does pose some interesting questions about what could come of machines with the kind of AI it might produce. Given how fast technology is now, and how much faster it will become, truly capable AI could bring a lot of benefits and a lot of drawbacks. As much as AI could produce machines that help us solve many problems, how do we keep it in check? How do we make sure it does not backfire on society? These are both problems we will have to solve as the technology matures.

IBM and other research facilities are hoping for a few things from cognitive computing. The ultimate goal, for many of these people, is AI that can be applied to technology in the workplace. The technology would be smart enough to do far more problem solving than it currently does.

Right now, technology is driven by what we directly input into it. Unless there is a major malfunction, technology will not do what you do not ask it to. With this new AI, technology would be able to assess the information around a problem and come up with ideas and solutions. This has the potential to give businesses useful insights they might not be able to come up with on their own.

The goal is not to have these machines replace human experts as they themselves become experts. The aim is a new, highly intelligent decision-support device. Many people will worry about job loss if AI becomes too powerful, as technology is already replacing jobs around the world. If businesses use this technology the way it is meant to be used, it will not in fact replace jobs. It should simply bolster profits and efficiency within a company.

For the average consumer, cognitive computing also offers some real advantages. Speech recognition is currently below where many people would like it to be; all we have to do is look at Apple's Siri to see that it is not very good. Cognitive computing would allow for better speech recognition. Because such a system learns so well, and so quickly, its potential in speech recognition is huge, especially in mobile markets where people are less willing to type.

Cognitive computing could also offer the benefit of good eye recognition. With an ever-increasing desire for security, and with fingerprint scanning becoming quite good, people will be looking for still other ways to secure their devices. Since no two eyes are the same, the ability to perform eye recognition in place of, or in parallel with, fingerprint recognition offers huge security benefits. Anything that puts the consumer's mind at ease could be a huge advantage for any company offering this type of security.

Cognitive computing offers a number of benefits to both businesses and consumers. It will be exciting to see when it really comes to market, and what the result will be. Cognitive computing will no doubt have a huge impact on how people use technology. I wouldn't be surprised if it arrives in full force within 2-5 years, and we will see it in places we never imagined.

