How To Change Perception And Get Rid Of Undesired Thoughts

Pessimism and unwanted thoughts can sometimes be much more than a simple distraction. Achieving peace of mind may require changing one's perception altogether, which is not an easy task to accomplish.

Perception is intimately connected with the senses of the body. Through these senses, we perceive the things around us, which is the foundation of our psychology. We perceive the smell of a flower, or the sight of fireworks in the sky. However, there are two distinct aspects to perception:

  • Positive perception
  • Negative perception

The first kind describes people with a positive outlook on life, who are able to work and function with confidence and to battle through life's problems. People who suffer from the second kind, negative perception, tend to suffer from severe psychological problems, which can degrade their quality of life. They struggle with low self-esteem, depression, anxiety, and other personal and social issues, all of which affect their mindset.

How To Change Negative Perception

Our studies have shown that there is no general or universal way to change a person's perception, because it is shaped and molded from the moment the senses become active in the first few years of life. However, one place to start is by distracting yourself. It helps to take up a new hobby, change jobs, or do something drastic that demands all your concentration and effort. Exposure to a completely new environment is also a great way to change an individual's perception. Our research work on this website offers a myriad of information on this subject.

The Effect Of Perception In Psychology

The effect of how we perceive things on our mindset has been of great interest to scientists and psychologists all over the world. Here are some inferences from our studies.

Perception is the ability to witness and sense things in the immediate environment, facilitated through the five senses of the human body. Perception forms from the moment children are born and begin to understand the world around them. However, it is a very subjective matter, and everyone perceives the same things differently. Psychology, on the other hand, is a person's mindset: it is the way people think and, consequently, the way they behave, which makes them individual human beings.

There is a sure connection between the two, as our studies have revealed. Perception is how we see things, whether real or not, and that is what essentially shapes our psychology. It can be called the most personal part of one's psychology, because even two people who have grown up together in the same kind of environment may not perceive things in the same manner. Hence, their psychology will differ.

How Psychology Shapes Perception

Psychological studies are conducted to determine how different people perceive the same things. Someone who is highly spiritual and believes in spirits and the like will find something supernatural in every situation, while someone who is dubious about such things may dismiss it all and walk with confidence even in the eeriest of settings. The two people perceive the same environment very differently, and they do so because of their psychology. Hence the two are interdependent.

How to Get Rid of Unwanted Negative Thoughts

Another instinctive way to keep persistent thoughts at bay is to put ourselves under pressure. The thinking here is that the load will leave little mental energy for the thoughts that trouble us.

Spending time with positive people also does a lot, subconsciously, to support positive thinking. Positive people give us the energy and strength required to accomplish tasks, which raises our confidence further.

Pick up a new hobby. If your town has a newsletter, find the latest copy and see whether there are any advertisements for clubs. Alternatively, ask your friends what hobbies they have, and see if you can join them. Having a friend along for a new experience can really help.

 

Organize an event, such as going out with your friends to see a show. This will give you something to look forward to.

Meditation promotes an attitude of compassion and non-judgment towards the thoughts that flit through the mind. This may also be a helpful approach to unwanted repetitive thoughts.

While continuously trying to suppress a thought makes it come back stronger, postponing it until later can work.

Researchers have tried asking those with persistent anxious thoughts to postpone their worrying until a designated 30-minute ‘worry period’. Some studies suggest that people find this works as a way of side-stepping thought suppression.

Language use is simpler than previously thought, study suggests


For more than 50 years, language scientists have assumed that sentence structure is fundamentally hierarchical, made up of small parts in turn made of smaller parts, like Russian nesting dolls.

A new Cornell study suggests language use is simpler than they had thought.

Co-author Morten Christiansen, Cornell professor of psychology and co-director of the Cornell Cognitive Science Program, and his colleagues say that language is actually based on simpler sequential structures, like clusters of beads on a string.

"What we're suggesting is that the language system deals with words by grouping them into little clumps that are then associated with meaning," he said.

Sentences are made up of such word clumps, or "constructions," that are understood when arranged in a particular order. For example, the word sequence "bread and butter" might be represented as a construction, whereas the reverse sequence of words ("butter and bread") would likely not.
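
To make the "construction" idea concrete, here is a minimal sketch (an illustration of the general concept, not the authors' model) in which word clumps are stored as ordered sequences mapped to meanings, so a reversed sequence is simply not recognized:

    # Illustrative sketch only: the entries below are hypothetical examples.
    constructions = {
        ("bread", "and", "butter"): "everyday essentials",
        ("kick", "the", "bucket"): "to die",
    }

    def lookup(words):
        """Return the meaning of a word clump if it is a known construction."""
        return constructions.get(tuple(words))

    print(lookup(["bread", "and", "butter"]))  # 'everyday essentials'
    print(lookup(["butter", "and", "bread"]))  # None: reversed order is no construction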

The sequence concept has simplicity on its side; language is naturally sequential, given the temporal cues that help us understand and be understood as we use language. Moreover, the hierarchy concept doesn't take into account the many other cues that help convey meaning, such as the setting, what was said before, and the speaker's intention.

The researchers drew on evidence in language-related fields from psycholinguistics to cognitive neuroscience. For example, research in evolutionary biology indicates that humans acquired language (and animals did not) because we have evolved abilities in a number of areas, such as being able to correctly guess others' intentions and learn a large number of sounds that we then relate to meaning to create words. In contrast, the hierarchy concept suggests humans have language thanks only to highly specialized "hardware" in the brain, which neuroscientists have yet to find.

Research in cognitive neuroscience shows that the same set of brain regions seem to be involved in both sequential learning and language, suggesting that language is processed sequentially. And several recent psycholinguistic studies have shown that how well adults and children perform on a sequence learning task strongly predicts how well they can process the deluge of words that come at us in rapid succession when we're listening to someone speak. "The better you are at dealing with sequences, the easier it is for you to comprehend language," Christiansen said.

The study by Christiansen and his colleagues has important implications for several language-related fields. From an evolutionary perspective, it could help close what has been seen as a large gap between the communications systems of humans and other nonhuman primates. "This research allows us a better understanding of our place in nature, in that we can tie our language ability, our communication abilities, more closely to what we can see in other species. It could have a big impact in terms of allowing us to think in more humble terms about the origin of language in humans," Christiansen said.

The research could also affect natural language processing, the area of computer science that deals with human language, by encouraging scholars to focus on sequential structure when trying to create humanlike speech and other types of language processing, Christiansen said. He pointed out that machines already successfully perform such tasks as translation and speech recognition thanks to algorithms based on sequential structures.
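
As a toy illustration of such sequence-based algorithms, here is a generic bigram model in Python; it shows the idea of statistics computed over word order and is not the method of any particular system mentioned above:

    # A generic bigram model: probabilities estimated from adjacent word pairs.
    from collections import Counter, defaultdict

    corpus = "the cat sat on the mat the cat ran".split()  # toy corpus

    bigrams = defaultdict(Counter)
    for w1, w2 in zip(corpus, corpus[1:]):
        bigrams[w1][w2] += 1

    def prob(w1, w2):
        """Estimate P(w2 | w1) from bigram counts."""
        total = sum(bigrams[w1].values())
        return bigrams[w1][w2] / total if total else 0.0

    print(prob("the", "cat"))  # 0.67: 'cat' follows 'the' in two of three cases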

The study, "How hierarchical is language use?" was published Sept. 12 in the Proceedings of the Royal Society B: Biological Sciences. The research was funded by the European Union, the Netherlands Organization for Scientific Research, and the Binational Science Foundation.

 

Journal Reference:

  1. S. L. Frank, R. Bod, M. H. Christiansen. How hierarchical is language use? Proceedings of the Royal Society B: Biological Sciences, 2012; DOI: 10.1098/rspb.2012.1741

'Psychopaths' have an impaired sense of smell, study suggests

A new study suggests that a poor sense of smell may be a marker for psychopathic traits. People with psychopathic tendencies have an impaired sense of smell, which points to inefficient processing in the front part of the brain. These findings by Mehmet Mahmut and Richard Stevenson, from Macquarie University in Australia, are published online in Springer’s journal Chemosensory Perception.
 

Psychopathy is a broad term that covers a severe personality disorder characterized by callousness, manipulation, sensation-seeking and antisocial behaviors, traits which may also be found in otherwise healthy and functional people. Studies have shown that people with psychopathic traits have impaired functioning in the front part of the brain – the area largely responsible for functions such as planning, impulse control and acting in accordance with social norms. In addition, a dysfunction in these areas in the front part of the brain is linked to an impaired sense of smell.

Mahmut and Stevenson looked at whether a poor sense of smell was linked to higher levels of psychopathic tendencies among 79 non-criminal adults living in the community. First they assessed the participants' olfactory ability and the sensitivity of their olfactory system. They also measured the participants' levels of psychopathy on four measures: manipulation, callousness, erratic lifestyles, and criminal tendencies, and noted how much or how little the participants empathized with other people's feelings.

The researchers found that individuals who scored highly on psychopathic traits were more likely to struggle both to identify smells and to tell the difference between smells, even though they knew they were smelling something. These results suggest that the brain areas controlling olfactory processing are less efficient in individuals with psychopathic tendencies.

The authors conclude: “Our findings provide support for the premise that deficits in the front part of the brain may be a characteristic of non-criminal psychopaths. Olfactory measures represent a potentially interesting marker for psychopathic traits, because performance expectancies are unclear in odor tests and may therefore be less susceptible to attempts to fake good or bad responses.”



 

Journal Reference:

  1. Mehmet K. Mahmut, Richard J. Stevenson. Olfactory Abilities and Psychopathy: Higher Psychopathy Scores Are Associated with Poorer Odor Discrimination and Identification. Chemosensory Perception, 2012; DOI: 10.1007/s12078-012-9135-7
 

Dyslexia cause may be different than previously thought

Dyslexia may result from impairment of a different linguistic system than previously thought, according to research published Sep. 19 in the open access journal PLOS ONE.

Speech perception engages at least two linguistic systems: the phonetic system, which extracts discrete sound units from acoustic input, and the phonological system, which combines these units to form individual words. Previously, researchers generally believed that dyslexia was caused by phonological impairment, but results from the current study, led by Iris Berent of Northeastern University in Boston, suggest that the phonetic system may actually be the cause.

"Our findings confirm that dyslexia indeed compromises the language system, but the locus of the deficit is in the phonetic, not the phonological system, as had been previously assumed," says Berent.

In the study, Hebrew-speaking college students had difficulty discriminating between similar speech sounds, but had no problem tracking abstract phonological patterns, even for novel words, suggesting that the phonological system is intact but the phonetic system is compromised.

"Our research demonstrates that a closer analysis of the language system can radically alter our understanding of the disorder, and ultimately, its treatment," says Berent.


Journal Reference:

  1. Iris Berent, Vered Vaknin-Nusbaum, Evan Balaban, Albert M. Galaburda. Dyslexia Impairs Speech Recognition but Can Spare Phonological Competence. PLoS ONE, 2012; 7 (9): e44875; DOI: 10.1371/journal.pone.0044875

Skip the cake? Neural processes at work during self-regulation identified

Almost everyone knows the feeling: you see a delicious piece of chocolate cake on the table, but as you grab your fork, you think twice. The cake is too fattening and unhealthy, you tell yourself. Maybe you should skip dessert.

But the cake still beckons.

In order to make the healthy choice, we often have to engage in this kind of internal struggle. Now, scientists at the California Institute of Technology (Caltech) have identified the neural processes at work during such self-regulation — and what determines whether you eat the cake.

"We seem to have independent systems capable of guiding our decisions, and in situations like this one, these systems may compete for control of what we do," says Cendri Hutcherson, a Caltech postdoctoral scholar who is the lead author on a new paper about these competing brain systems, which will be published in the September 26 issue of The Journal of Neuroscience.

"In many cases, these systems guide behavior in the same direction, so there's no conflict between them," she adds. "But in other cases, like the all-too-common inner fight to resist the temptation of eating the chocolate cake, they can guide behavior toward different outcomes. Furthermore, the outcome of the decision seems to depend on which of the two systems takes control of behavior."

A large body of evidence shows that people make decisions by assigning different values to the various options, says Antonio Rangel, a professor of economics and neuroscience and the senior author of the paper. To make their decisions, people select the choice with the highest value. "An important and controversial open question — which this study was designed to address — is whether there is a single value signal in the brain, or if there are instead multiple value signals with different properties that compete for the control of behavior."

According to the single-value hypothesis, Rangel explains, the ability to say no to the chocolate cake depends on just one system that compares values like healthiness and taste. But the multiple-value hypothesis suggests that there are different systems that process different values. The ability to turn down the cake therefore depends on whether the brain can activate the appropriate system — the one that evaluates healthiness. If you do not want the cake, it means you place a higher value on health than on taste and your brain acts accordingly.
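
The contrast between the two hypotheses can be made concrete with a small sketch; all option values and weights below are hypothetical, chosen only to illustrate the logic:

    # Hypothetical attribute values for the two choices.
    options = {"eat cake": {"taste": 0.9, "health": 0.1},
               "skip cake": {"taste": 0.2, "health": 0.8}}

    def single_value_choice(weights):
        """Single-value hypothesis: one integrated signal; pick the highest."""
        return max(options, key=lambda o: sum(weights[a] * v
                                              for a, v in options[o].items()))

    def multiple_value_choice(system_in_control):
        """Multiple-value hypothesis: the system in control decides alone."""
        return max(options, key=lambda o: options[o][system_in_control])

    print(single_value_choice({"taste": 0.4, "health": 0.6}))  # 'skip cake'
    print(multiple_value_choice("taste"))                      # 'eat cake'
    print(multiple_value_choice("health"))                     # 'skip cake'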

In the study, the researchers asked 26 volunteers to refrain from eating for four hours prior to being tested. During the experiment, a functional magnetic resonance imaging (fMRI) machine was used to measure the brain activity of the hungry participants while they decided how much they were willing to pay for different snacks, which were shown on a computer screen. The items, including foods like chips and vegetables, varied in taste and healthiness. The subjects were explicitly asked to make their choices in one of three conditions: while attempting to suppress their desire to eat the food, while attempting to increase their desire to eat the food, or while acting normally. The volunteers could do whatever they wanted to control themselves — for example, focusing on the taste (say, to increase their desire to eat something delicious but unhealthy) or the healthiness of the item (to reduce that urge).

After a four-second period, the participants placed real bids for the right to buy the items; the bids reflected the value they placed on the food.

The researchers found that activity in two different brain areas correlated with how much the participants said they wanted an item, as indicated by their bids. The two regions were the dorsolateral prefrontal cortex (dlPFC), which sits behind the temples, and the ventromedial prefrontal cortex (vmPFC), which lies behind the middle of the forehead, just above the eyes.

Significantly, the two areas played very different roles in the self-regulation process. When volunteers told themselves not to want the food, the dlPFC seemed to take control; there was a stronger correlation between the signals in this area and behavior, while the signals in the vmPFC appeared to have no influence on behavior. When the volunteers encouraged themselves to want the food, however, the role of each brain region flipped. The vmPFC took control while the signals in the dlPFC appeared to have no effect.

The researchers also found that the brain's ability to switch control between these two areas was not instantaneous. It took a couple of seconds before the brain was able to fully ignore the conflicting region. For example, when a volunteer tried to suppress a craving, the vmPFC initially appeared to drive behavior. Only after a couple of seconds — while the participant tried to rein in his or her appetite — did the correlation between bids and vmPFC activity disappear and the dlPFC seem to take over.
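
The logic of that analysis can be illustrated with simulated data (the signals and numbers below are invented for illustration, not the study's measurements): whichever system is in control shows a strong trial-by-trial correlation between its signal and the bids, while the decoupled system shows almost none.

    # Simulated data only; not the study's analysis pipeline.
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials = 200
    bids = rng.uniform(0, 3, n_trials)                 # willingness-to-pay, in dollars

    in_control = bids + rng.normal(0, 0.5, n_trials)   # signal tracking behavior
    decoupled = rng.normal(0, 1.0, n_trials)           # signal unrelated to behavior

    print(np.corrcoef(bids, in_control)[0, 1])  # strong correlation with the bids
    print(np.corrcoef(bids, decoupled)[0, 1])   # near zero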

"This research suggests a reason why it feels so difficult to control your behavior," Hutcherson says. "You've got these really fast signals that say, go for the tempting food. But only after you start to go for it are you able to catch yourself and say, no, I don't want this."

Previous work in Rangel's lab showed that when dieters made similar food choices, their decisions were controlled only by the vmPFC. The researchers speculate that because dieters are more accustomed to self-control, their brains do not show the neural struggle seen in the new study. If that is the case, then it may be possible that people can improve their self-control with more practice.

In addition to Hutcherson and Rangel, the other authors on the Journal of Neuroscience paper are Hilke Plassmann from the École Normale Supérieure in France and James Gross of Stanford. The title of the paper is "Cognitive regulation during decision making shifts behavioral control between ventromedial and dorsolateral prefrontal value systems." This research was funded by grants from the National Science Foundation, the National Institutes of Health, and the Gordon and Betty Moore Foundation.


Journal Reference:

  1. Cendri A. Hutcherson, Hilke Plassmann, James J. Gross, and Antonio Rangel. Cognitive Regulation during Decision Making Shifts Behavioral Control between Ventromedial and Dorsolateral Prefrontal Value Systems. The Journal of Neuroscience, 26 September 2012, 32(39): 13543-13554; DOI: 10.1523/JNEUROSCI.6387-11.2012

'He says, she says': How characteristics of automated voice systems affect users' experience

 The personality and gender of the automated voices you hear when calling your credit card company or receiving directions from your GPS navigational system may have an unconscious effect on your perception of the technology. Human factors/ergonomics researchers have studied how the gender and tone selected for an interactive voice response system, or IVR, affects its user-friendliness and will present their findings at the upcoming HFES 56th Annual Meeting in Boston.

IVRs have become increasingly popular, particularly with the introduction of mobile technology such as Apple Siri and Iris for Android. Past studies have indicated that users are more responsive to actual human voices than to computer-generated voices, but little research has been completed on the role that voice characteristics play in user perceptions of the technology.

In their upcoming Annual Meeting presentation, "He Says, She Says: Does Voice Affect Usability?" Rochelle Edwards and Philip Kortum describe a study in which participants interacted with a medical IVR that collected information about their health. Users responded to both male and female voices that spoke in different tones — upbeat, professional, or sympathetic — and were then asked to judge the system's usability.

"We have been systematically looking at what affects user performance on IVRs for some time now," said Kortum. "Voice is the major element in an IVR interface, as graphical elements are for a Web page, and this study was a first attempt to understand the impact voice might have on the perceived usability of such systems."

The authors found that although IVRs with male voices tended to be perceived as more usable than those with female voices, they were not considered more trustworthy. The researchers encourage designers to take voice characteristics into consideration when developing future systems.

"Anyone who uses an IVR knows how frustrating they can be," continues Kortum. "Much of this frustration stems from poorly designed IVRs, not from the form of interface being intrinsically 'bad.' This research shows that some simple modifications to the design of these systems can have an impact on the usability of voice interfaces."

Walking to the beat could help patients with Parkinson's disease

Walking to a beat could be useful for patients needing rehabilitation, according to a University of Pittsburgh study. The findings, highlighted in the August issue of PLOS ONE, demonstrate that researchers should further investigate the potential of auditory, visual, and tactile cues in the rehabilitation of patients suffering from illnesses like Parkinson's Disease — a brain disorder leading to shaking (tremors) and difficulty walking.

Together with a team of collaborators from abroad, Ervin Sejdic, an assistant professor of engineering in Pitt's Swanson School of Engineering, studied the effects of various metronomic stimuli (a mechanically produced beat) on fifteen healthy adults, ages 18 to 30. Walkers participated in two sessions consisting of five 15-minute trials in which the participants walked with different cues.

In the first, participants walked at their preferred walking speed. Then, in subsequent trials, participants were asked to walk to a metronomic beat produced by way of visuals, sound, or touch. Finally, participants were asked to walk with all three cues simultaneously, with the pace set to that of the first trial.
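
For the curious, a metronomic auditory cue of this kind can be approximated in a few lines of Python; the parameters below (tempo, click length, pitch) are assumptions for illustration, not the study's stimulus settings.

    # Writes a 16-bit mono WAV click track at a fixed tempo (assumed parameters).
    import wave
    import numpy as np

    rate, bpm, seconds = 44100, 100, 10
    click = np.sin(2 * np.pi * 1000 * np.arange(int(0.02 * rate)) / rate)  # 20 ms, 1 kHz

    track = np.zeros(rate * seconds)
    step = int(rate * 60 / bpm)  # samples between beats
    for start in range(0, len(track) - len(click), step):
        track[start:start + len(click)] = click

    with wave.open("metronome.wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(rate)
        f.writeframes((track * 32767).astype(np.int16).tobytes())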

"We found that the auditory cue had the greatest influence on human gait, while the visual cues had no significant effect whatsoever," said Sejdic. "This finding could be particularly helpful for patients with Parkinson's Disease, for example, as auditory cues work very well in their rehabilitation."

Sejdic said that with illnesses like Parkinson's Disease, a big question is whether researchers can better understand the changes that come with this deterioration. Based on their study, the Pitt team feels that tactile cues could be considered as an alternative modality in rehabilitation and should be explored further in the laboratory.

"Oftentimes, a patient with Parkinson's Disease comes in for an exam, completes a gait assessment in the laboratory, and everything is great," said Sejdic. "But then, the person leaves and falls down. Why? Because a laboratory is a strictly controlled environment. It's flat, has few obstacles, and there aren't any cues (like sound) around us. When we're walking around our neighborhoods, however, there are sidewalks, as well as streetlights and people honking car horns: you have to process all of this information together. We are trying to create that real-life space in the laboratory."

In the future, Sejdic and his team would like to conduct similar walking trials with patients with Parkinson's Disease, to observe whether their gait is more or less stable.

"Can we see the same trends that we observed in healthy people?" he said. "And, if we observe the same trends, then that would have direct connotations to rehabilitation processes."

Additionally, his team plans to explore the impact of music on runners and walkers.

Funding for this project was provided, in part, by the University of Pittsburgh, the University of Toronto, and Holland Bloorview Kids Rehabilitation Hospital.


Journal Reference:

  1. Ervin Sejdić, Yingying Fu, Alison Pak, Jillian A. Fairley, Tom Chau. The Effects of Rhythmic Sensory Cues on the Temporal Dynamics of Human Gait. PLoS ONE, 2012; 7 (8): e43104; DOI: 10.1371/journal.pone.0043104

Hearing brains are 'deaf' to disappearance of sounds, study reveals

Our brains are better at hearing new and approaching sounds than at detecting when a sound disappears, according to a Wellcome Trust-funded study published September 27. The findings could explain why parents often fail to notice the sudden quiet from the playroom that usually accompanies the onset of mischief.

Hearing plays an important role as an early warning system, rapidly directing our attention to new events. Indeed, we often rely on sounds to alert us to things happening around us before we see them, for example somebody walking into the room while our back is turned to the door. Yet little is known about how our brains make sense of the sounds around us and what makes us hear certain events while completely missing others.

Researchers at the UCL Ear Institute wanted to understand what makes certain sounds easily detectable while others go unnoticed. They created artificial 'soundscapes' composed of different ongoing sounds and asked listeners to detect the onset or disappearance of different sound-objects within the melee.
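
A rough sketch of such a soundscape (hypothetical frequencies and timing, not the study's actual stimuli) might superimpose several ongoing tones and silence one of them partway through:

    # Hypothetical scene: four ongoing tones; one disappears at the midpoint.
    import numpy as np

    rate = 44100
    t = np.arange(rate * 6) / rate        # a 6-second scene
    freqs = [220, 330, 495, 660]          # ongoing "sound-objects"

    scene = sum(np.sin(2 * np.pi * f * t) for f in freqs)
    half = len(t) // 2
    scene[half:] -= np.sin(2 * np.pi * 330 * t[half:])  # the 330 Hz object vanishes at 3 s
    scene /= np.abs(scene).max()          # normalize for playback or saving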

Overall, the team found that listeners are remarkably tuned to detecting new sounds around them but are much less able to detect when a sound disappears. In busy sound environments, the participants missed more than half of the changes occurring around them and the changes that were detected involved much longer reaction times. The effects were observed even in relatively simple soundscapes and didn't seem to be affected by volume.

Dr Maria Chait, who led the research at the UCL Ear Institute, said: "On the one hand, we might expect to be more sensitive to the appearance of new events. In terms of survival, it is clearly much more important to detect the arrival of a predator than one that has just disappeared. But this reasoning doesn't apply to other situations. Imagine walking in a forest with your friend behind you and suddenly having the sound of their footsteps disappear. Our results demonstrate that there are a large number of potentially urgent events to which we are fundamentally not sensitive. We refer to this phenomenon as 'disappearance blindness'."

The study also explored how resilient listeners are to scene interruptions. In busy scenes, such as those we often face in the world around us, important scene changes frequently coincide in time with other events. The study showed that even brief interruptions, such as a short 'beep' occurring at the same time as the change, are sufficient to make listeners fail to notice larger scene changes. It is thought that this occurs because the interruption briefly captures our attention and prevents the information about the change from reaching our consciousness.

"Understanding what makes certain events pop out and grab attention while others pass by un-noticed is important not only for understanding how we perceive the world but also has important practical applications. For example, to aid the design of devices intended to help professionals such as air traffic controllers and pilots who operate in environments where the detection of change is critical," added Dr Chait.


Journal Reference:

  1. Francisco Cervantes Constantino, Leyla Pinggera, Supathum Paranamana, Makio Kashino, Maria Chait. Detection of Appearing and Disappearing Objects in Complex Acoustic Scenes. PLoS ONE, 2012; DOI: 10.1371/journal.pone.0046167