Wednesday, July 1, 2015

What is Stupid? (from Medium @realBenParrish)

Originally taken from Medium (Benjamin Parrish, @realBenParrish)
If you believe you have some sort of cognitive malady thwarting your advancement in the workplace or in an academic environment, listen up. I fancy a bit of neuroscience from time to time, something odd for a 16-year-old boy to claim, I know. But I relish learning about what makes us, us: what makes us sad, what makes us so enthralled when our favorite football team scores a touchdown, what lets us remember what we had for breakfast this morning, what enables us to build rocket ships and venture off to the moon. It’s all neuroscience, the brain and its willful manifestations of desires and ambitions. Or perhaps the workings aren’t so paramount and vivacious. Maybe it’s watching the latest episode of Dance Moms, or texting that girl you met on the dance floor the other night. What made you hesitate to ask her out? What made you nervous? It’s your brain, my friend.

I think we’re finally becoming aware of this roughly three-pound gelatinous heap stored inside our craniums, thanks to the myriad of news articles depicting every last advance in the field of neuroscience. It seems that every other day we’re learning a little more about how we work, about how the brain works. And this is what I want to bring to everybody’s awareness: for the first time in history, the general educated public is aware that the brain is the very center of their beings, when before it was just their personalities or souls. People now realize that it’s not people that are stupid; it’s their brains that are stupid. “You need to get your brain checked” and “What on earth is going on up there?” have become commonplace utterances in academic and workplace settings among people who notice some sort of intellectual deficiency in their schoolmates or coworkers.
In fact, with the advent of neuroimaging techniques like MEG (magnetoencephalography), we can see a good deal of what’s going on inside someone’s head from a scientific standpoint. The question becomes: if intelligence and cognition are the result of microscopic physiological processes, are people really to blame for their deficiencies? Shouldn’t we then look at intelligence deficiencies as a medical problem and not an issue of someone’s personality?
If you’re even slightly literate, or if you have relatives over the age of 60, you should be somewhat aware of a little (synonymous with really flipping big, in this instance) something called Alzheimer’s disease, a silent killer of over 500,000 people in the United States every year (http://www.nydailynews.com/life-style/health/alzheimer-kills-previously-thought-study-article-1.1712078). Alzheimer’s is what we call a neurodegenerative disease, catalyzing the atrophy of synaptic connections in the brain. While we know very little of the exact mechanics behind consciousness, it is known that these synaptic connections facilitate most neurological happenings, such as memory and basic thought, along with a multitude of other cognitive functions. Symptoms of Alzheimer’s cover the spectrum of just about any neurological or cognitive deficit, manifesting on MEG scans as greatly reduced neuronal activity. The result is memory loss, impaired reasoning, episodes of delusion, and a plethora of other symptoms. Now, from a purely neurological standpoint, the MEG images of an Alzheimer’s patient and of a person fitting the typical “stupid” stereotype are not too far off in resemblance: both depict lower-than-average synaptic activity. I am not in any way entertaining the position that people with this unfortunate diagnosis are stupid; I am simply noting the correlation between a person of lesser intelligence without any medically diagnosed neurological calamity and a person who is cognitively handicapped.

Schematic displaying the neurodegeneration that characterizes Alzheimer’s disease
If people of less intelligence than their peers show decreased cortical activity in their MEG images, and Alzheimer’s patients show a similar but more drastic reduction in neural activity, are the people of less intelligence also in a neurological predicament? Both parties show similar symptoms of neurological impairment (in varying degrees of severity, of course); based on this, draw your own conclusion.
Harvard University and Boston Children’s Hospital recently published a study (one spanning more than 12 years, mind you) detailing the varying degrees of cognitive and neurological damage inflicted on the minds of Romanian orphans who were in the horribly unfortunate position of never knowing adoption into loving, supportive homes (http://www.telegraph.co.uk/news/science/science-news/11370571/Loving-foster-homes-repaired-brain-damage-of-Romanian-orphans.html). The part of the brain affected appears to have been the “white matter”: the myelinated tissue that insulates the axons of neurons, speeding up inter-neuronal communication by lessening the electrical charge lost en route to the next neuron in the sequence of almost innumerable cells that gives rise to consciousness. Symptoms generally manifested much later in life in the form of subtle intellectual impairments and emotional dysfunction. Much as our brief analysis of Alzheimer’s disease yielded, the MEG scans of these children relayed a similar lack of neuronal activity, much like that of people of less intelligence (again, varying in severity). Of course, children who did know the joys and assurances of adoption made a full neurological recovery, just to make that clear. But this raises the question: if these Romanian children would be referred to as “stupid” for the rest of their lives due to their neurological injuries, the same label belonging to people of apparently less intelligence, wouldn’t this be considered a medical problem, since physiological and anatomical malformations account for the cognitive impairments?

Romanian Children at Orphanage in Filipesti, Romania
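To get a feel for why that insulating white matter matters so much, here is a rough back-of-the-envelope sketch, not a measurement: textbook figures put unmyelinated axons at roughly 1 m/s conduction velocity and heavily myelinated ones at roughly 100 m/s, and the travel distance of 0.5 m is just an illustrative value.

```python
# Rough illustration of why myelin ("white matter") matters for signal speed.
# Velocities are approximate textbook figures, not measured data:
# ~1 m/s for an unmyelinated axon, ~100 m/s for a heavily myelinated one.

def transit_time_ms(distance_m, velocity_m_per_s):
    """Time for a nerve signal to travel a given distance, in milliseconds."""
    return distance_m / velocity_m_per_s * 1000

distance = 0.5  # half a meter, e.g. roughly spinal cord to hand

slow = transit_time_ms(distance, 1.0)    # unmyelinated axon
fast = transit_time_ms(distance, 100.0)  # myelinated axon

print(f"Unmyelinated: {slow:.0f} ms")  # 500 ms
print(f"Myelinated:   {fast:.0f} ms")  # 5 ms
```

A hundredfold difference in travel time, which is why damage to that insulation shows up as sluggish, impaired cognition.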
And if these children’s symptoms are akin to those of the typical “stupid” fellow, wouldn’t both groups be under similar medical conditions, differing only in severity? If we note similar characteristics in people unfortunately designated “stupid” and in documented medical predicaments, we should classify both groups as having medically recognized neurological conditions, since they exhibit very similar symptoms. The parallels and congruences in their symptoms point to this conclusion.
Of course, acknowledging this doctrine, assigning mental disabilities to people who underperform in an academic setting or simply carry the title “dim” or “stupid”, evokes the controversy of blaming every little fault in one’s behavior or intelligence on the happenings inside one’s brain and the neurological mechanics thereof. We arrive, then, at the crossroads of science and ethics, much as Charles Darwin did with his advocacy of evolution and John Mather did with his measurements of the cosmic background radiation pointing to the Big Bang as the inception of our universe. Should we let preconceived belief get in the way of concretely gathered evidence? As incorrect and unpopular as this thesis may sound, it’s what the science is pointing towards. It’s up to us whether to accept or reject this new interpretation of intelligence.

Wednesday, September 24, 2014

It's YOUR Brain

The brain is one of the least explored areas of modern science, and things are just now heating up. Discoveries like the mapping of the cerebrum and the physiological identification of Alzheimer's have marked monumental progress in understanding that gelatinous lump of gray matter in your head, and thankfully, we're only at the beginning. Recent studies reveal a certain malleability to the mind, a state of impermanence that is most prevalent in childhood yet remains very well into adulthood. This carries implications, of course: if the brain is in a constant state of fluctuation, can't that be both good and bad? Neuroplasticity, the brain's (and mind's, if you're not into that realist mumbo-jumbo) capacity for constant shifting and changing, is now at the forefront of neurological research, with the surmise that this process is the very fabric of the tangible mind. In sum, neuroplasticity can be good or bad, just as a lump of clay can be molded into a beautiful sculpture or into something resembling my first-grade art project. Everyone has this lump of clay; it is up to us what we decide to do with it.

Neuroplasticity has been utilized by medical professionals for decades as a method of rehabilitation and relearning, even though we have only recently become aware of it. Among its many applications, stroke recovery is perhaps the best showcase of neuroplasticity in its benevolence. In 2009, a healthy woman in Florida experiences a stroke and, almost in an instant, loses feeling and control of much of her right side, along with her speech and cognitive skills. She is diagnosed with a cerebrovascular insult, a condition likely to leave her crippled for life. She advances out of intensive care into a stage of rehabilitation with very dismal hope of ever regaining her former self and brain. But, defying expected recovery results, the woman is able to recover her speech, cognitive reasoning, and most of her motor skills thanks to a revolutionary technique called constraint-induced movement therapy, first developed out of the Silver Spring monkey experiments (a matter for a different time). The therapy entails immobilizing the healthy limb in a restraining device, forcing the patient to attempt to use the disabled arm. The first few sessions are not successful; zero motor skills return. But at around the 17th hour of the bitter failure that made up most of the treatment, results manifest: the woman begins to move her fingers. Finger movements grow into fist-clenching, and fist-clenching grows into wrist movement. Eventually, at around the 55th hour of treatment, one year after her stroke, the patient regains nearly all of her former motor skills in her affected arm and leg. The reaction from the scientific community was ecstatic, and similar trial treatments spawned in recovery centers around the country and later the world.
After much study and cerebral mapping, it was discovered that this woman's brain had anatomically changed: preexisting neurons had rewired themselves into the formerly dead left motor cortex, bringing newfound control to her neurologically disconnected limbs once more. Speaking bluntly, neuroplasticity has enormous potential in the field of stroke rehabilitation and in medicine as a whole.

But what does this mean for average, physically sound men and women like us? It means that we have the power to change arguably the most important organ in our body for our own benefit and use. In fact, learning itself is an example of neuroplasticity; learning involves physical changes in the anatomy of the brain, and then of the mind (unless you're a realist). So when you're learning, say, the alphabet in your Kindergarten or Pre-K class, your neurons are making physical connections to other neurons (via synapses, at the junctions between axons and dendrites), and those connections enable the learning to occur. But let's say the reverse happens: you forget. Forgetting is the result of under-use; the connections between certain neurons (the ones that recorded where your car keys are) deteriorate. When those synaptic connections deteriorate, you forget; it's as simple as that. What I'm trying to get at is that you CAN learn to do anything. Your brain knows no bounds besides under-stimulation, the only thing that causes intellectual regression (besides major neurological disorders).
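The "use it or lose it" idea above can be sketched as a toy simulation. This is a cartoon, not real neuroscience: a single synaptic weight that gets a small boost every time the connection is exercised and slowly decays when it sits idle, with entirely arbitrary rates chosen for illustration.

```python
import random

# Toy model of learning vs. forgetting: one synaptic "weight" between 0 and 1
# that strengthens with use and decays with disuse. All rates are arbitrary
# illustrative values, not physiological constants.

def simulate(steps, p_use, strengthen=0.1, decay=0.02, w=0.5):
    """Return the synaptic weight after `steps` time steps, where the
    connection is exercised with probability `p_use` on each step."""
    for _ in range(steps):
        if random.random() < p_use:
            w = min(1.0, w + strengthen)  # practice strengthens the link
        else:
            w = max(0.0, w - decay)       # idleness lets it fade
    return w

random.seed(0)
print(simulate(200, p_use=0.8))   # frequent use: weight climbs toward 1
print(simulate(200, p_use=0.05))  # rare use: weight withers toward 0
```

Run it with different `p_use` values and the clay metaphor falls out of the arithmetic: the same connection ends up strong or gone depending purely on how often it is exercised.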

The purpose of this post was to point out that your brain is never set in its ways. You can accomplish anything, acquire any skill, or overcome any obstacle you may come across. Think about it: a stroke patient with sections of her brain literally dead can recover almost fully and regain her motor skills. If she can do that, can't you do the same thing on a smaller scale? Kids who are struggling in school, people who are having trouble reading: your brain isn't hardwired to be that way. With the right work, and the right amount of it, you can gain any set of skills or any body of knowledge you so desire. It's all up to you.

It's YOUR brain. Do what YOU want with it.


What the Internet Doesn't Want You to Know

Upon completing the necessary coursework for various academic studies and exploring my own interests, I have encountered several pieces of material dealing extensively with the Net's impact on the human social, psychological, and intellectual condition. To say the least, I am very intrigued. When reading Nicholas Carr's The Shallows and Sherry Turkle's Alone Together, both very provocative titles on the effects of the Net on man, one can't help but encounter a certain dread, a certain loathing for the culture that technology has molded us into: a culture that feeds off the gossip of one's mostly fictitious successes and off unneeded connectivity. And yet, despite our awareness of the erosion of humanity's social normalcy, we cannot so much as renounce a sliver of our usage of technologies like Twitter, Facebook, and Instagram: products that market themselves as social tools but in actuality isolate the user. Why do we do this? Why are we so fused with and inseparable from an aspect of life only five, six, seven years old? Maybe it's because these technologies take advantage of something inside us, an evolutionary aspect of the subconscious human mind that makes us yearn for connectedness and networking. Or maybe we like the efficiency of hand-crafting a "virtual me" to show off to the world via Facebook: an online persona that shows our triumphs and the person we WANT to be rather than the person we actually are. This drive for connection and personal esteem tethers us to the unfortunate new culture of the Net.

As humans, we crave intellectual stimulation. We crave exploring, wondering, discovering, and encountering. We crave interaction, stimulation, and occupation; stagnancy is loathed and rejected in today's culture. For thousands upon thousands of years we have sought this wonder and stimulation by developing fire, cultivating agriculture, devising written language, composing beautiful art and music, and even walking on the moon. We wrote and created tools in our free time; we engaged in imaginative novel-reading alongside improving our environments. We were creators of information, interpreters of nature, and explorers of distant lands. What are we doing now? Does our natural curiosity drive us to see who's dating whom on Facebook? Or to read about Kim Kardashian's third wedding? There is real damage in this new-age curiosity, according to Nicholas Carr, a Dartmouth graduate and revered columnist. Carr describes a situation in which the formerly quiet and reflective mind of the pre-information age transitions into a mind of "information hunting and gathering," with the Net as the happy hunting ground. If all this information is being streamed to us with little to no work, don't you think the mind is going to take advantage and exploit that? How many of us have googled a piece of information more than once? How many feet are in a mile? What's the capital of Canada? How do you cook hamburgers? Is it far-fetched to say that the Net is supplementing, if not replacing, the human memory? Modern-day academia is plagued with this brain-Internet swap, and it's only getting worse. Schools for the first time are letting their students attend class with cell phones on and fully involved in the child's education. The insertion of these technologies has handicapped even the best students in the crowd: these students google information they received and were expected to learn in class; they don't bother with the learning.
I hear my peers talk of having to google the formula for arithmetic sequences while in calculus class. And guess what? They do it again the next night. The same. Exact. Topic.

But it is not students alone who are impacted by this sudden implementation of worldwide communication and infinite access to information in their pockets. Adults, well-educated, intelligent, affluent, responsible adults, are being molded just as badly as the kids, if not worse. The BlackBerry and the iPhone have become staples of the modern workforce, and their implementation knows no bounds. These digital briefcases encompass every single tool a working person may wish to access. Need to ask a colleague about the report you're planning? Email's got it. Realize, while sitting in a taxi cab, that you didn't finish the presentation due today? "The Cloud" and your iPhone have your back. Emails and texts and presentation work and reports and IMing your wife and emailing some more and then texting some more. So while technology has undoubtedly enabled us to do more, to what quality are we doing the things we are enabled to do more of? I've read countless accounts (and maybe you can testify to this too) of working adults so preoccupied with their technology that they lose sight of the task they initially set out to do. And if it's actually inhibiting the quality of our productivity, don't you think it's time to take it easy on the technology?

As stated above, Nicholas Carr and I both fear the growing erosion of human deep thought. We both fear for the freedom of a drifting mind, a daydreamer and a reflector. These workmen and women, on their devices twenty-four seven, don't have time to daydream and reflect. That time is spent finishing the presentation in the back of the cab rather than stepping back and wandering in one's own mind. We have become our machines, meaning that most of the time we perform tasks and then perform more tasks. Our leisure time is now spent on the very technology originally intended for work. I think it's time we step back and try to become a little more human for a change: free (at least temporarily) of the intellectual dependence we place on the Net, and free of being, as Sherry Turkle puts it, Alone Together. It is this limitation of technology and added human reflection time that I am advocating, and it is my prayer and wish that you would step back, unplug, and think. Just you and your brain: the original human leisure time.