Is there any way a human could whistle and be unable to speak?


Is there any situation, anatomically, in which a human could understand the speech of others perfectly, without any capability of speech themselves, but would retain the ability to whistle a tune?

I was led to believe that the larynx is the most important organ for creating the sounds of speech; however, I don't know whether a damaged larynx would necessarily prevent someone from speaking.

I would suggest that someone with no tongue cannot speak coherently; however, I would imagine that the tongue is also instrumental (pardon the pun) in creating a range of sounds whilst whistling.

Is there perhaps a part of the brain that could be affected in order to stop someone speaking? Or would this also mean the person could not understand speech?


Short answer(s)
Someone with a damaged larynx may still speak with the use of a speech aid (electronic larynx).

The ability to understand speech does not necessarily mean one can speak normally. There are neurological disorders in which people can understand speech but have difficulty producing it.

Background
A full removal of the larynx (laryngectomy) prevents the patient from producing speech, because the vocal cords are removed. However, this can be solved with the use of a voice prosthesis (Cancer Research UK).

There are neurological disorders in which one can still understand speech but has an impaired ability to produce it, called apraxia of speech. Apraxia can be acquired or congenital (developmental) (NIH). The developmental type is not well understood at the neural level (Dyspraxia Foundation UK). The acquired type is known to occur through damage to specific brain areas, for example due to trauma or stroke. One of the areas associated with acquired apraxia is Broca's area (Graff-Radford et al., 2014), a brain area intimately associated with the production of speech.


Broca's area. Source: UC Irvine

However, although damage to this area can result in pure apraxia (i.e., impaired speech but normal speech understanding), it can also result in Broca's aphasia, which means that language understanding is also impaired (American Heart Association). Moreover, regarding your specific question, apraxia is characterized by impaired speech, but not a total lack of speech production.

Reference
Graff-Radford et al. (2014). Brain & Language, 129: 43-46.


You can whisper without a larynx. "Patients who have undergone partial or full laryngectomy are typically unable to speak anything more than hoarse whispers, without the aid of prostheses or specialized speaking techniques." (source)

So the remainder of the vocal tract is capable of enough modulation to produce recognizable human speech in many languages.

I expect, however, that the tongue, throat, jaw, and facial muscles figure very heavily into this modulation.

If these muscles become paralyzed or otherwise unusable, it's reasonable to assume that the person may still be able to whistle using the lungs and hands or fingers.

Hands-free whistling, however, would require at least teeth and lips, or teeth and tongue, or lips and tongue: one surface to control the flow of air over the other surface that produces the whistle. With control over just these few things, one could likely communicate to a reasonable degree in many languages.


Speech is produced by generating a frequency spectrum with the vocal folds and then filtering it with the upper vocal tract. Whistling is done by blowing air over the shaped tongue and lips.
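
To make that source-filter picture concrete, here is a minimal Python sketch (using numpy and scipy; the sample rate, pulse rate, and formant values are assumed, purely illustrative numbers, not measurements). A pulse train stands in for the vocal-fold source, and a few resonant filters stand in for the vocal-tract formants. Silencing the source kills the "speech" while leaving the filter intact, which is the laryngectomy scenario in miniature.

    import numpy as np
    from scipy.signal import lfilter

    fs = 16000                 # sample rate in Hz (assumed)
    f0 = 120                   # "vocal fold" pulse rate in Hz (assumed)
    n = int(0.5 * fs)          # half a second of samples

    # Source: a crude glottal pulse train -- this is what a laryngectomy removes
    source = np.zeros(n)
    source[::fs // f0] = 1.0

    def resonator(x, freq, bw):
        """Filter x through one two-pole resonance (one vocal-tract formant)."""
        r = np.exp(-np.pi * bw / fs)    # pole radius set by the bandwidth
        w = 2 * np.pi * freq / fs       # pole angle set by the center frequency
        return lfilter([1 - r], [1, -2 * r * np.cos(w), r ** 2], x)

    # Filter: rough formant frequencies/bandwidths for an /a/-like vowel (assumed)
    speech_like = source
    for freq, bw in [(700, 130), (1200, 70), (2600, 160)]:
        speech_like = resonator(speech_like, freq, bw)

    # With source set to all zeros (no larynx), the output is silence no matter
    # how the "tongue and lips" (the resonators) are shaped -- yet whistling,
    # which generates its own source at the lips, would be unaffected.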

So, give someone a laryngectomy. They still retain the upper vocal tract (and so can still whistle), but they cannot generate the source vibrations to speak. Do it to an adult and they will probably still understand the speech of others.


It seems that speech can be recovered after total glossectomy, although it's not a short or an easy process.

Here's one case:

a case study, reporting the evaluation and evolution findings of the speech-language pathology rehabilitation of the swallowing and speech functions of a 58-year-old man submitted to total glossectomy in June 2009. After evaluation, the subject was diagnosed with severe mechanical oropharyngeal dysphagia and alteration in speech articulation. Speech rehabilitation used direct and indirect therapies. Indirect therapy focused on oral motor control, sensitivity, mobility, motricity, tonus and posture of the structures adjacent to the resected tongue. Direct therapy used the head back posture maneuver to help the ejection of food into the pharynx. The patient started exclusive oral feeding, except for solid foods, after ten months in treatment. Over-articulation, speed and rhythm exercises were used to improve speech intelligibility. Thus, the results of speech-language pathology intervention were considered positive, and the patient was discharged after a year in treatment.

Speech therapy in total glossectomy - case study, Vieira 2011


It's in the public interest that the law protects whistleblowers so that they can speak out if they find malpractice in an organisation.

As a whistleblower you're protected from victimisation if you're:

  • a worker
  • revealing information of the right type by making what is known as a 'qualifying disclosure'
  • revealing it to the right person and in the right way making it a 'protected disclosure'

'Worker' has a special and wide meaning for these protections. As well as employees, it includes agency workers and people who aren't employed but are in training with employers.

Student nurses and student midwives doing work experience as part of an education course or training approved by, or under arrangements with, the Nursing and Midwifery Council also fall within the meaning of worker for these protections.


The Science Of Human Connection And Wellness In A Digitally Connected World

The most precious commodities on this planet are our health, love, and happiness. Regardless of what we accomplish and accumulate in life, we are unable to take it with us.

In the fast-paced, consumer-driven, social-media-saturated world that we live in today, success and happiness are often defined by the status of what we achieve and the value of the things that we own.

Everywhere we look, we are inundated with the same message: the measure of our self-worth is directly proportional to the measure of our material wealth.

Whether it’s the status car, the trendiest clothes, the luxury home, or the CEO title that comes with the envied corner office with a view, these and many other status symbols of wealth and success seem to forever define our value in our culture today, immortalized by the cinematic perfection of superheroes and superstars, and broadcast through the perfectly curated lives that bombard us daily from “friends” on social media.

Fueled by equal parts aspiration and expectation, envy has, oddly enough, become the 21st century’s most enduring economic driver, feeding our most persistent social cravings and endless material consumerism.

In our effort to keep up with all that is expected of us — and all that we expect of ourselves — many of us find ourselves in perpetual motion, filling our days with the hyper-active, turbo-charged, “crazy busy” schedules that leave us struggling to eat healthily and to keep a balance between our busy careers and all that’s happening in our personal lives. And when we do achieve success, it seems that quality personal time for ourselves and for nurturing our relationships becomes increasingly elusive.

Psychologists see a pattern in this success-driven culture of busyness and the associated “connection disconnection” of an increasingly digitally remote world, and it’s triggering what they say is rapidly becoming a dire epidemic of loneliness. Among the elderly, this epidemic of loneliness is known as the “hidden killer.”

With our daily use of email, texting, smart phones, professional and social media, we live in an age of instant global connectivity. We are more connected to one another today than ever before in human history, yet somehow, we’re actually increasingly feeling more alone.

No longer considered a marginalized issue suffered by only the elderly, outcasts, or those on the social fringe, the current wave of loneliness sweeping the nation is hitting much closer to home than you might think. And as shocking as it may seem, new research shows that loneliness may be the next big public health crisis to face Americans since the rise of obesity and substance abuse.

In fact, loneliness and its associated depression have become downright rampant, even amongst some of the most successful, with studies showing that business executives and CEOs may suffer at more than double the rate of the general public as a whole, which is already an astonishing twenty percent.

What’s more, this ever-growing loneliness among the hyper-successful is not just a result of the social and professional isolation of living in a more global and digitized world; it’s also a “lonely at the top” malaise that’s spreading largely due to the sheer emotional exhaustion of business and workplace burnout.

Science is now sounding the alarm that there’s a significant correlation between feeling lonely and work exhaustion — and the more exhausted people are, the lonelier they feel. This, of course, is made worse by the ever-growing number of professionals who now work mobile and remotely.

Throughout history, human beings have inherently been social creatures. Over millions of years we have evolved to survive and thrive through the “togetherness” of social groups and gatherings, and we often decline without that kind of meaningful personal contact. Today, modern communication and technology have forever changed the landscape of our human interaction, and our highly individualistic, digitally remote, material-driven culture is now challenging all of this, as we turn to science to unlock the mysteries of human connection and wellness in a digitally connected world.

Connection of Disconnection

In a world where some of our most personal moments are “Shared” online with “Friends”, business meetings are replaced with digital “Hangouts”, and the most important breaking news is “Tweeted” in a mere 140 characters or less, we often seem much more captivated by the flashing notifications on our mobile phones than by what we’re actually experiencing outside of our tiny 5.7-inch screens.

Mobile technologies ushered in by Internet icons like Google, who have literally defined what it means to have “the world’s information at your fingertips”, have no doubt brought us one step closer to truly living in a “Global Village”. However, no matter how small the world may seem to be getting, it now also feels like it’s often becoming a much less personable place to live in as well.

Combating this epidemic also means understanding just how much the “connection disconnection” of loneliness negatively impacts our health, and beginning to attend to the signs and symptoms of loneliness with preventative measures, the very same way we would with diet, exercise, and adequate sleep.

Dr. John Cacioppo, PhD, is a Professor of Neuroscience and director of the Center for Cognitive and Social Neuroscience at the University of Chicago, and a leading researcher on the effects of loneliness on human health. According to Dr. Cacioppo, the physical effects of loneliness and social isolation are as real as any other physical detriment to the body — such as thirst, hunger, or pain. “For a social species, to be on the edge of the social perimeter is to be in a dangerous position,” says Dr. Cacioppo, who is the co-author of the best-selling book “Loneliness: Human Nature and the Need for Social Connection”, hailed by critics as one of the most important books about the human condition to appear in a decade.

Loneliness changes our thoughts, which changes the chemistry of our brains, says Dr. Cacioppo. “The brain goes into a self-preservation state that brings with it a lot of unwanted side effects.” These include increased levels of cortisol, the stress hormone that can predict heart death due to its highly negative effects on the body. This increase in cortisol triggers a host of negative physical effects — including a persistent disruption in our natural patterns of sleep, according to Dr. Cacioppo. “As a result of increased cortisol, sleep is more likely to be interrupted by micro-awakenings,” reducing our ability to get enough quality sleep, which in time begins to erode our overall health and well-being.

One of the most important discoveries of Dr. Cacioppo’s research is the epigenetic impact that loneliness has on our genes. In his recent studies, tests reveal how the emotional and physical effects of loneliness actually trigger cellular changes that alter gene expression in our bodies, or “what genes are turned on and off in ways that help prepare the body for assaults, but that also increases the stress and aging on the body as well.” This epigenetic effect provides important clues for improving our understanding of the physical effects of loneliness; in an increasingly remote and digitally connected world, minding our digital footprint and ensuring that we cultivate real and meaningful relationships with others may hold the key to keeping us healthy and keeping the onset of loneliness at bay.

Social Media’s Alone Together

Worldwide, there are over 2.01 billion active monthly users of social media, and of the 300 million of us in the United States, sometimes it feels like we’ve all just become new “Friends” on Facebook.

With so many of us being “Friends” and so well connected, you’d think that our social calendars would be totally full.

But the sad truth is that for all of the social media friends that we may have out in cyberspace, studies show that social media usage is actually making us less socially active in the real world, and Americans in particular are finding themselves lonelier than ever.

According to a recent study by sociologists at Duke University and the University of Arizona, published in the American Sociological Review, Americans’ circle of close friends and confidants has shrunk dramatically over the past two decades, and the number who say they have no one outside of their immediate family to discuss important matters with has more than doubled, reaching a shocking 53.4% — up 17% since the dawn of the Internet and social media.

What’s more, nearly a quarter of those surveyed say they have no close friends or confidants at all — a 14 percent increase since we all became so digitally connected.

Looking at the stats, we should ask ourselves, are digital communication technologies and social media platforms like Facebook and Twitter helping us or actually hurting us?

Many experts seem to feel the latter, and see a clear pattern in which social media use and the decline in social intimacy contribute greatly to today’s social and personal breakdown.

In her recent book “Alone Together: Why We Expect More from Technology and Less from Each Other”, MIT Professor Dr. Sherry Turkle, PhD argues the case that this just may be so.

Dr. Turkle puts forth a host of pretty convincing signs that technology is threatening to dominate our lives and make us less and less social as humans. In Alone Together, she warns us that in only just a few short years, technology has now become the architect of our intimacies. “Online, we fall prey to the illusion of companionship, gathering thousands of Twitter and Facebook friends, and confusing tweets and wall posts with authentic communication.” But this relentless online digital connection is not at all real social intimacy, and leads us to a deep feeling of solitude.

Compounding matters is the added burden of increasingly busy schedules. People are now working very long hours — far more than in any recent history — and many feel that the only way they can make social contact is online, via social media or even online dating apps, which they often feel is faster and cheaper than actually going out for an intimate connection in person. Many even prefer the limited effort necessary to maintain digital friendships versus live interpersonal relationships, which allows them to feel connected — while actually remaining somewhat disconnected.

This is perhaps ever more apparent with a new generation of Americans who have grown up with smartphones and social media, and as a result, may have even lost some fundamental social skills due to excessive online and social media use.

Dr. Brian Primack, PhD, is the director of the Center for Research on Media, Technology and Health at the University of Pittsburgh, and co-author of a study published in the American Journal of Preventive Medicine, which shows that those who spend the most time digitally connecting on social media — more than two hours a day — had more than twice the odds of feeling socially isolated and lonely, compared to those who spend only a half hour per day. While real-life, face-to-face social connectedness is strongly associated with feelings of well-being, the study shows that this naturally expected outcome seems to change when our interactions happen virtually. The results seem counterintuitive — yet the negative outcome was entirely consistent.

Dr. Primack’s earlier research on the connection between social media use and depression in young adults seemed to confirm what many already suspected: that our self-esteem can easily take a nosedive each time we log in to a social media network. There is a natural tendency to compare our lives to those we see online, and when we see others seemingly living the life of our dreams, it’s human nature to feel just a little bit envious. However, if left unchecked, that envy can quickly turn into low self-esteem — and that can quickly spiral into depression. And like a vicious cycle, the more depressed we are and the lower our self-esteem, the lonelier we feel.

Meanwhile, a recent study found that those who gave up Facebook for even just a week felt much happier, less lonely, and less depressed at the end of the study than other participants who continued using it.

The message is clear: it’s important to use social media in positive ways. It’s a strong reminder of the importance of establishing real and meaningful interpersonal friendships, versus isolating ourselves in the digital social world. Real-life interactions help us to build lasting relationships that fulfill our innate human need to form bonds and feel connected.

The solution, experts say, is that we have to begin to recognize the inherent pitfalls of social media and begin to utilize our online time in more positive ways that enhance our relationships — not detract from them. Social media can actually be a positive step toward building a “Global Village”, if we make it so.

It all depends on how we choose to interact online. It’s important to remember this, in our ever-busy quest for success in our increasingly digitally connected lives.

Connect With Your Friends The Old Fashioned Way — Device Free.

I have established really strong boundaries: device-free outings on date nights, with my friends, and in business meetings. Let me clarify: a device can be present, however it must be switched off completely and preferably out of sight.

I have one friend whom I visit sometimes. She is unable or unwilling to hear the boundaries that I would like to have regarding our device-free get-togethers. She is really smart and quite amazing; we will talk for about 10 minutes, having a delightful, deep, meaningful conversation, and then, like a merciless predator going in for the kill, she preys on her phone and starts in on her social media. She is an addict. I overtly exit “stage left.” She is disappointed that I leave. This is the only way I can train her with regards to having a device-free get-together. Our conversations have actually gotten longer since I have been doing that. When we go out for dinner she has to leave her phone at her home, otherwise she is unable to resist it. The question I ask her is, “Dinner with Marina, or will it be Dinner with your Phone?” She does opt for Dinner with Marina.

Self Love is one of the most important loves of all. When we learn to love ourselves completely, then we can truly love others.

Connect with Your Friends & Loved Ones & Disconnect from Loneliness

01. Choose Self Love & Practice Self Love With Regards To How You Want It To Show Up In Your Life.

02. Choose To Be Worthy & Deserving Of Being Loved By Others On Your Own Terms.

03. Choose To Love People Unconditionally With Strong Boundaries.

04. Choose To Love People Unconditionally Without Being Taken Advantage Of.

05. Choose To Celebrate Who You Are.

06. Choose To See Your Value & How Valuable You Are To Yourself & Others.

07. Choose To Have Self Worth & Self Esteem & Positive Self Deserving In All Areas of Your Life.

08. Choose To Be Empathic With Your Friends With Strong Boundaries.

09. Choose To Be A Great Listener.

10. Choose To Be Worthy & Deserving To Be Listened To & Be Heard.

11. Choose To Be A Good Friend Without Being Taken Advantage Of.

12. Choose To Be Respectful, Present and Mindful With Your Friends.

13. Choose To Speak Your Truth With Emotional Intelligence.

14. Choose To Have Confidence In All Areas of Your Life.

15. Choose To Authentically Live Your Own Personal Truth In All Areas of Your Life.


Biology is NOT destiny

Our fear of biology can be paralyzing.

Much of my writing addresses the underlying biological processes that run in the background of things like infidelity, mate selection, and sexual relationships. I like to explain these things with the goal of normalizing them for people. I have seen people in so much pain over feelings like jealousy, or over the decline of their sexual relationship, that I believe it helps (takes the pressure off, so to speak) for them to understand that some of this is just biology, working the way it does.

But apparently these arguments are too successful, because people now approach me concerned that these biological indicators are actually the death knell for their relationships. One man wrote to me saying that his wife showed all the signs I wrote about, and that it must be inevitable that she would be unfaithful. After all, who can stop biology?

Freud wrote that "anatomy is destiny," describing how he believed women MUST be, and how their behaviors and personalities are compelled by female biology. Today, we know much about the different biological processes involved in gender differences and behaviors. But we know even more about the complex, unpredictable interplay between environment and biology. In fact, we know enough now that we are almost back to square one, unable to predict anything with certainty. I once asked a famous evolutionary researcher if he had ever examined his own testosterone levels, or measured the symmetry of his own face. "No," he said. "The value of these data is not meaningful at the individual level. It only has predictive ability at the statistical level, when you predict trends across large numbers of subjects."

I'm toying with the idea of putting together an "infidelity prediction list," including things like "Does your partner smell sexy to you?", "Is your ring finger longer than your index finger?", "How big are your testicles?", "Is your face symmetrical?", etc., all things that are biological indicators correlating with the risk of infidelity. But the thing is, we know that even if you end up with every one of these markers, it doesn't govern your behavior. Biology may subtly influence your choices, particularly if you go through life on automatic pilot. But if you are aware and conscious, making thoughtful, considered decisions, it is you who is in charge of your life. Not your cells, genes, or gonads.

Our brains allow us the blessing of overcoming biology.

So: afraid that the influence of biology on sexual behavior is compelling you or a loved one toward decisions and outcomes you fear? Worried that biology has doomed you to infidelity, or a failed relationship? Then sit down and do some examination of yourself, and with those you love. Make some hard decisions, comparing what you want in the short term and the long term. Biology is on the side of the short-term payoff, almost every time. But the power of the human brain lies in the ability to step beyond biology and look at long-term consequences before they happen. Exercise it, or you are indeed doomed to a life ruled by biological predestination.


Could Human Enhancement Turn Soldiers Into Weapons That Violate International Law? Yes

Science fiction, or actual U.S. military project? Half a world away from the battlefield, a soldier controls his avatar-robot that does the actual fighting on the ground. Another wears a sticky fabric that enables her to climb a wall the way a gecko or spider would. Returning from a traumatic mission, a pilot takes a memory-erasing drug to help ward off post-traumatic stress disorder. Mimicking the physiology of dolphins and sled dogs, a sailor is able to work his post all week without sleep and with only a few meals.

All of these scenarios are real military projects currently in various stages of research. This is the front line of the Human Enhancement Revolution -- we now know enough about biology, neuroscience, computing, robotics, and materials to hack the human body, reshaping it in our own image. And defense-related applications are a major driver of science and technology research.

But, as I reported earlier, we also face serious ethical, legal, social, and operational issues in enhancing warfighters. Here, I want to drill down on what the laws of war say about military human enhancements, as we find that other technologies such as robotics and cyberweapons run into serious problems in this area as well.

Should enhancement technologies -- which typically do not directly interact with anyone other than the human subject -- nevertheless be subject to a legal weapons review? That is, is there a sense in which enhancements could be considered "weapons" and therefore fall under the authority of certain laws?

In international humanitarian law (IHL), also known as the laws of war, the primary instruments relevant to human enhancements include the Hague Conventions (1899 and 1907), the Geneva Conventions (1949, and Additional Protocols I, II, and III), the Biological and Toxin Weapons Convention (1972), the Chemical Weapons Convention (1993), and other laws. Below, I discuss these agreements and what their implications may be for human enhancement.

1. Would human enhancements count as weapons under the Geneva Conventions?

Let's start with the basic requirement that new weapons must conform to IHL. Article 36 of the Geneva Conventions, Additional Protocol I of 1977, specifies:

In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.

But does Article 36 apply to human enhancement technologies? That is, should they be considered as a "weapon" or "means or method of warfare" in the first place? Unlike other weapons contemplated by IHL, enhancements usually do not directly harm others, so it is not obvious that Article 36 of Additional Protocol I would apply here. If anyone's safety is immediately at risk, it would seem to be that of the individual warfighter -- thereby turning the debate into one about bioethics. To that extent, warfighters, whether enhanced or not, are not weapons as typically understood.

Yet in a broader sense, the warfighter is not only a weapon but perhaps a military's best and oldest weapon. Warfighters carry out missions, they sometimes kill enemies, and they represent one of the largest expenditures or investments of a military. They have cognitive and physical capabilities that no other technology currently has, and this can make them ethical, lethal, and versatile. The human fighter, engaged in hand-to-hand combat, would be the last remaining weapon when all others have been exhausted. So in this basic sense, the warfighter is undeniably a weapon or instrument of war.

Still, should Article 36 be interpreted to include warfighters themselves as weapons subject to regulation? There could be several reasons to think so. First, other organisms are plausibly weapons subject to an Article 36 review. Throughout history, humans have employed animals in the service of war, such as dogs, elephants, pigeons, sea lions, dolphins, and possibly rhinoceroses. Dogs, as the most commonly used animal, undergo rigorous training, validation, and inspections. If a military were to field a weaponized rhino in an urban battlefield that contains innocent civilians, we would be reasonably worried that the war-rhino does not comply with Article 36: if rhinos cannot reliably discriminate friend from foe, a rhino may target and charge a noncombatant child in violation of the principle of distinction. A similar charge would apply to autonomous robots in such a general environment in which distinction is important, as opposed to a "kill box", or an area of such fierce fighting that all noncombatants can be presumed to have fled.

If autonomous robots are clearly regulatable weapons, then consider the spectrum of cyborgs -- part-human, part-machine -- that exists between robots and unenhanced humans. Replacing one body part, say a human knee, with a robotic part starts us on the cybernetic path. And as other body parts are replaced, the organism becomes less human and more robotic. Finally, after (hypothetically) replacing every body part, including the brain, the organism is entirely robotic with no trace of the original human. If we want to say that robots are weapons but humans are not, then we would be challenged to identify the point on that spectrum at which the human becomes a robot or a weapon.

The inability to draw such a line may not be a fatal blow to the claim that humans should be treated as weapons: after all, we cannot draw a precise line at which a man who is losing his hair becomes "bald", yet there's clearly a difference between a bald man and one who has a head full of hair. But a simpler solution may be to say that humans are weapons, especially given the reasons offered previously.

As it applies to military enhancements, integrated robotics may be one form of enhancement, but we can also consider scenarios involving biomedical enhancements such as pharmaceuticals and genetic engineering. Again, on one end of the spectrum would stand a normal, unenhanced human. One step toward the path of being fully enhanced may be a warfighter who drinks coffee or pops amphetamines ("go pills" in military-speak) as a cognitive stimulant or enhancer. Another step may be taking drugs that increase strength, erase fear, or eliminate the need for sleep. At the far, more radical end may be a warfighter so enhanced that s/he no longer resembles a human being, such as a creature with four muscular arms, fangs, fur, and other animal-like features. If a war-rhino should be subject to Article 36, then so should this radically enhanced human animal, so it would seem. And to avoid the difficult question of drawing the line at which the enhanced human becomes a weapon, a more intuitive position would be that the human animal is a weapon all along, at every point in the spectrum, especially given the previous reasons that are independent of this demarcation problem.

If we agree that enhanced human warfighters could be properly weapons subject to Article 36, what are the implications? Historically, new weapons and tactics needed to conform to at least the following: (1) principle of distinction, (2) principle of proportionality, and (3) prohibition on superfluous injury or unnecessary suffering, often abbreviated as SIrUS.

To explain: first, the principle of distinction demands that a weapon must be discriminating enough to target only combatants and never noncombatants. Biological weapons and most anti-personnel landmines, then, are indiscriminate and therefore illegal in that they cannot distinguish whether they are about to infect or blow up a small child versus an enemy combatant. Unintended killings of noncombatants -- or "collateral damage" -- may be permissible, but not their deliberate targeting; and to the extent that biological weapons today target anyone, they also target everyone. (If they don't target anyone in particular but still kill people, then they would immediately seem to be indiscriminate.) However, future biological weapons, e.g., a virus that attacks only blue-eyed people or a certain DNA signature, may be discriminate and therefore would not violate this principle (but could violate others).

Second, the principle of proportionality demands that the use of a weapon be proportional to the military objective, so as to keep civilian casualties to a minimum. For instance, dropping a nuclear bomb to kill a hidden sniper would be a disproportionate use of force, since other, less drastic methods could have been used.

Third, the SIrUS principle is related to proportionality in that it requires methods of attack to be minimally harmful in rendering a warfighter hors de combat, or unable to fight. This prohibition has led to the ban of such weapons as poison, exploding bullets, and blinding lasers, which cause more injury or suffering than needed to neutralize a combatant.

However implausible, we can imagine a human enhancement that violates these and other provisions -- for instance, a hypothetical "berserker" drug would likely be illegal if it causes the warfighter to be inhumanely vicious, aggressive, and indiscriminate in his attacks, potentially killing children. (For the moment, we will put aside enhancements that are directed at adversaries, such as a mood-enhancing gas to pacify a riotous crowd or a truth-enhancing serum used in interrogations: the former would be prohibited outright in warfare by the Chemical Weapons Convention, partly because it is indiscriminate, and the latter may be prohibited by laws against torturing and mistreating prisoners of war.) The point here is that it's theoretically possible, even if unlikely, for a human enhancement to be in clear violation of IHL.

But let us assume that human enhancement technologies generally conform to these basic principles. (If they do not, then there's already strong prima facie reason to reject those technologies as unlawful under IHL; those are the easy cases that do not need to be examined here.) Given this assumption, are there other, less obvious international laws that could prohibit military enhancements? Let's examine a few more possible areas of concern:

2. Would human enhancement count as a biological weapon under the Biological and Toxin Weapons Convention?

First, the above discussion on whether enhancements are weapons is relevant not only to Article 36 of Additional Protocol I but also arguably to the Biological and Toxin Weapons Convention (BTWC). The first article of the BTWC states that:

Each State Party to this Convention undertakes never in any circumstances to develop, produce, stockpile or otherwise acquire or retain: (1) microbial or other biological agents, or toxins whatever their origin or method of production, of types and in quantities that have no justification for prophylactic, protective or other peaceful purposes; (2) weapons, equipment or means of delivery designed to use such agents or toxins for hostile purposes or in armed conflict.

Whether or not they are properly weapons, are military human enhancements "biological agents" in any reasonable sense? The BTWC is silent on this question, though it does anticipate unforeseen developments in genetic engineering, biotechnology, synthetic biology, and other scientific fields. The usual assumption is that these "agents" are both limited to being roughly microbial in size and to biological substances that are directed at adversaries, not at the enhancement of one's own military personnel. This assumption, unfortunately, is not made explicit in the BTWC; that is, the convention does not define what a biological agent is. As a result, it is still an open question whether the BTWC applies to human enhancement technologies.

To answer this open question, let's try to better understand what a "biological agent" is. It seems to mean an agent that is biological in nature (e.g., anthrax bacteria), as opposed to purely chemical (e.g., chlorine gas) or physical (e.g., a falling object); and an agent is a substance or actor employed for some effect or purpose (e.g., LSD is a psychotropic agent). In a broader but consistent sense, agents can be persons too (e.g., a government spy is a "secret agent"). If so, then enhanced warfighters can be agents. Even if we reject this understanding and stipulate that biological agents must be nonperson substances -- an interpretation that is not explicit in the BTWC -- we can still consider the enhancement technology itself as an agent, apart from the warfighter it enhances.

Again, insofar as the BTWC does not specify that biological agents must be of the kind that directly harms adversaries, some human enhancements -- such as anabolic steroids for increased strength -- would seem to count as biological agents: they are substances employed for some effect and are biological in nature. They would serve "hostile purposes" in that they create a warfighter more capable of defeating adversaries and fulfilling military missions; so these enhancements would at least indirectly harm others.

With respect to scale, it is difficult to see why size would matter for the BTWC, which again is not explicit on the issue. If we understand the BTWC to be interested in only microbial-sized agents -- and returning to the position that humans can be agents -- then consider a hypothetical process that can shrink a human soldier to the size of bacteria, such as in the theatrical film Fantastic Voyage: If size matters, then the BTWC would seek to regulate the microscopic soldier, but not the full-sized soldier who has the exact same capabilities. Why the difference in concern here? It may be that the microscopic soldier can be stealthier, infiltrate more places, and so on, but none of these concerns is cited in the BTWC as a motivating reason for regulation.

Related to enhancements, the BTWC arguably would have something to say about bioengineered insects and animals that are used as weapons. Like pathogens, insects and most animals do not obey human orders and would therefore be unpredictable and indiscriminate as weapons -- and tiny attack-insects do not seem significantly different in kind from microscopic organisms also designed for attack. One possible difference is that microorganisms typically harm us from the inside out, and somehow this could be less humane and more frightening than biting our bodies from the outside in. Yet we can also envision bioengineered animals that operate from the inside out too, as tapeworms and mosquitoes do (or at least the diseases they transmit into our bloodstreams). So if it's not unreasonable to think that bioengineered insects would be subject to the BTWC, then size does not matter for the BTWC, or at least its interest is not limited to microscopic organisms.

As for other qualifiers in the BTWC, some enhancements could be noncompliant in that they have no "prophylactic, protective or other peaceful purposes." A hypothetical berserker drug could be an example: its only obvious function is to make a person a fiercer, rampaging combatant. This is to say that, under some plausible understanding of the BTWC, at least some possible warfighter enhancements could count as "biological agents" and therefore subject to the BTWC. If the BTWC intends or ought to rule out enhancements under its purview, then its language needs to be made more explicit.

3. Could human enhancements violate international humanitarian law because they are "repugnant to the conscience of mankind"?

Contributing to the above problem with the BTWC -- i.e., what counts as a "biological agent" -- is also a lack of specificity on the motivating reasons for the BTWC in the first place. That is, the convention is unclear on why we should want to prohibit biological and toxin weapons. But there are some clues. In the preamble to the BTWC, state parties to the convention declare they are:

Convinced of the importance and urgency of eliminating from the arsenals of States, through effective measures, such dangerous weapons of mass destruction as those using chemical or bacteriological (biological) agents, ...

Convinced that such use would be repugnant to the conscience of mankind and that no effort should be spared to minimize this risk,

That is, biological agents, such as highly infectious bacteria or viruses, are difficult to control in their propagation and therefore are indiscriminate to use as a weapon. Anthrax spores, for instance, may be carried by the wind and can infect a child or entire populations just as easily and likely as a soldier. This would be a clear violation of the principle of distinction in IHL.

If this were the only motivating reason for the BTWC, then perhaps we could conclude that human enhancements are not the biological agents that the convention intends to address; enhancements generally are not infectious or "weapons of mass destruction." But this cannot be the only reason. In its categorical prohibition of biological and toxin weapons, the BTWC does not distinguish between infectious and noninfectious ones. For instance, a poison dart that can be used only once in a precisely targeted attack would still be banned, even though it is not a weapon of mass destruction, given that it is a toxin, and especially if there were no "prophylactic, protective or other peaceful purposes" for the poison.

To explain why the prohibition is categorical, we can examine the next clue, that the BTWC is motivated by "the conscience" of humanity. That is, some methods of killing are more insidious and repugnant than others. Biological and toxin weapons, then, are of special concern, because they are usually silent, invisible, and indiscriminate ways of killing people -- often with horrific, painful medical symptoms over the course of several days or weeks.

But is any of this relevant to human enhancements? Again, enhancements usually do not directly harm others, much less kill people in "repugnant" ways. Even if we say that enhancements indirectly harm others, they do not typically do so in ways more repugnant than conventional means, since an enhanced warfighter is still bound by IHL to never use certain weapons and tactics against adversaries.

Like the "weapons of mass destruction" clue, that a biological agent is "repugnant to the conscience of mankind" also does not seem to be a necessary requirement, just a sufficient one. Consider that some poisons or pathogens may kill quickly and painlessly, such as those administered in death-penalty executions: They seem to be much more humane than conventional means, such as shooting bullets and dropping bombs that render an adversary hors de combat through massive, bloody injury to human bodies and brains. Nevertheless, these "clean" poisons are prohibited by the BTWC and elsewhere, such as the Hague Conventions. So, even if human enhancements are not repugnant in the same ways that anthrax or arsenic may be, and even if they are not weapons of mass destructions, they could still fall under the authority of the BTWC, again since the convention is not explicit on its motivating reasons.

In any event, enhancements could be repugnant in different ways. We previously mentioned the possibility of creating a "berserker" drug, as well as a warfighter so enhanced that s/he no longer resembles a human being, such as a creature with four muscular arms, fangs, fur, and other animal-like features. If this sounds far-fetched, we need only look at the history of warfare to see that intimidating adversaries is a usual part of it. From fierce Viking helmets, to samurai armor designed to resemble demons, to tigers and sharks painted onto warplanes, to ominous names for drones (e.g., "Predator" and "Reaper"), scaring adversaries can demoralize them and make them easier to defeat. This suggests that it may be neither irrational nor inconsistent with customary practices to design enhancements to be inhuman and therefore perhaps inhumane.

Further, biomedical research is presently ongoing with "chimeras", or animals composed of genes or cells from other organisms not involved with the reproduction of those animals. These may include animals created with human genes, for instance, in order to grow transplantable organs in vivo and for research to find medical cures. Manipulation of human embryos, too, can lead to human-animal chimeras, though this possibility has caused much ethical concern and debate, so much so that U.S. legislation -- Human Chimera Prohibition Act of 2005 -- had been proposed to prohibit this line of research, calling it an affront to human dignity as well as an existential threat.

Not all enhancements, of course, are as fanciful as a human-chimeric warrior or a berserker mode, nor am I suggesting that any military has plans to do anything that extreme. Most, if not all, enhancements will likely not be so obviously inhuman. Nonetheless, the "conscience of mankind" is sometimes deeply fragmented, especially on ethical issues. So what is unobjectionable to one person or culture may be obviously objectionable to another.

Something as ordinary as, say, a bionic limb or exoskeleton could be viewed as unethical by cultures that reject technology or such manipulation of the human body. This is not to say that ethics is subjective and we can never resolve this debate, but only that the ethics of military enhancements -- at least with respect to the prohibition against inhumane weapons -- requires specific details about the enhancement and its use, as well as the sensibilities of the adversary and international community. That is, we cannot generalize that all military enhancements either comply or do not comply with this prohibition.

Beyond the BTWC, inhumanity as a prohibitory reason is a common theme that underlies IHL. In the preamble to the first Hague Convention in 1899:

Until a more complete code of the laws of war is issued, the High Contracting Parties think it right to declare that in cases not included in the Regulations adopted by them, populations and belligerents remain under the protection and empire of the principles of international law, as they result from the usages established between civilized nations, from the laws of humanity and the requirements of public conscience.

Known as "the Martens Clause", this basic principle is found throughout the laws of armed conflict, such as the Geneva Conventions and its Additional Protocols and opinions issued by the International Court of Justice. As one would expect, much debate has occurred on what the "laws of humanity" and "requirements of public conscience" are, especially related to the actual or even threatened use of nuclear weapons. And the same debate could be applied to emerging technologies, most notably in a recent report by Human Rights Watch on attack drones.

I won't engage that lengthy and unresolved debate here, except to note that a prohibition against inhumane weapons and methods is a fundamental principle, sometimes explicit and sometimes implied, that underwrites the laws of war and is therefore relevant to an ethics assessment of military enhancements. This is also to say that an ethics assessment of new weapons, such as military enhancements, seems to be legally required by IHL, at least in the context of the Martens Clause if not also Article 36 of the Geneva Conventions, Additional Protocol I.

4. How will human enhancement redefine the ethical limits on how combatants may be treated?

The concept of inhumanity is important to clarify, not just for the legal evaluation of weapons but also for the ethical limits on how combatants may be treated. The prohibition on torture, for instance, presumes certain facts about the human condition, such as the kinds of treatment that cause pain, how much pain a person can withstand, how much sleep a person needs, and so on. For instance, if our tolerance for pain could be dramatically elevated, then what counts as torture today may no longer be so -- and therefore such behavior may become morally permissible.

More generally, ethics itself also presumes a similar set of facts about the human condition, such as that we are fairly susceptible to being killed. These facts inform our ethics, for instance, in when self-sacrifice is permitted or prohibited and, again, in what kinds of action toward others are unethical. If we change these presumed facts about human bodies and minds, then ethical prohibitions and permissions may also be affected. This gives us reason to believe that an ethical code of behavior for robots could very well be different from how humans ought to behave; for instance, robots -- to the extent that they have no instinct for self-preservation, cannot feel pain, etc. -- may be permitted to sacrifice themselves in more trivial scenarios than human ethics might allow.

At the beginning of this section, I suggested that there is a continuum from a fully human animal to a cybernetic organism to a fully robotic machine. This spectrum is perhaps defined by how many human body parts we replace with mechanical ones, ranging from zero to all. Enhanced warfighters, then, could fall somewhere in the middle of this continuum. If "robot ethics" is different from human ethics, at least where relevant facts about humans and robots differ, then it seems that "cyborg ethics" too would diverge from human ethics where there's a relevant difference in the construction and abilities of cyborgs and humans. Though not all enhanced persons are cyborgs, e.g., if the enhancements are genetic, pharmacological, or otherwise not robotic, we can also reasonably conclude that ethics for enhanced persons generally may differ from standard human ethics.

So it becomes an interesting question whether it would still be illegal or inhumane to whip a prisoner of war, or deprive him of food or sleep, if the individual can better withstand a whipping or does not have the same food or sleep requirements that normal people typically do. These actions possibly would not cause pain or suffering, or at least as much of it, to the enhanced subject; therefore, it would be difficult to count those actions as torture.

Beyond prisoners of war, questions about inhumane treatment could be directed at how we treat our own enhanced warfighters. For instance, drill sergeants may be tempted to push an enhanced soldier harder than other ones without augmented strength and endurance, and perhaps reasonably so. But where there are prohibitions on what military trainers are permitted to do, we may need to reevaluate those rules where an enhancement might change the presuppositions about human limits that motivated those rules in the first place.

The above discussion certainly does not exhaust all the legal issues that will arise from military human enhancements. In our new report, funded by The Greenwall Foundation and co-written with Maxwell Mehlman (Case Western Reserve University) and Keith Abney (California Polytechnic State University), we launch an investigation into these and other issues in order to identify problems that policymakers and society may need to confront.

Beyond IHL, we also examine in the report US domestic law, military policy, bioethics, and risk assessments, and then we offer a new framework for evaluating human enhancement technologies in a military context. As an initial model, we also discuss further considerations -- related to virtues, emotions, and broader social impacts -- that can be integrated into this evaluative framework. (This essay is adapted from that report.)

Given the significant lag time between ethics and technology, it is imperative to start considering their impacts before technologies fully arrive on the scene and in the theater of war. Consider, for instance, the explosion in the number of robots in war: in its invasion of Iraq, the US had zero ground robots in 2003 and suddenly about 12,000 in 2008, and its inventory of aerial robots multiplied 40-fold between 2002 and 2010. This report, therefore, is intended to anticipate the ethical, legal, and policy surprises of new technologies, which -- in the case of military drones -- have already led to international outcry, as well as harm to reputations and real lives. With emerging human enhancements, we can think first before we act.


UFOs and the Strange Disappearance of Gerry Irwin

On February 28, 1959, Private First Class Gerry Irwin was on his way back to his duties as a Nike missile technician at Fort Bliss in El Paso, Texas, after having just enjoyed a one-month leave in Nampa, Idaho. That evening he was passing through a portion of southern Utah, near Cedar City, when there was a bright flash above and he could make out a brilliant white light traveling across the night sky. It was apparently so bright that it lit up the desolate landscape around him, and Irwin actually stopped his car and got out to track its progress across the star-filled sky until it disappeared behind a ridge. At this point he was sure it had to be an aircraft going down, and it had been so alarmingly low that he was convinced he had witnessed a plane crash. He got in his car and scrawled out a note reading “Have gone to investigate possible plane crash, please call law enforcement officers,” which he left on his steering wheel; he then wrote “STOP” across the side of his car with shoe polish before venturing out over the scrub-filled area towards that ridge and the unknown. And so would begin one of the earliest and weirdest alien abduction cases on record.

Thirty minutes passed before any other car came along on this remote stretch of road, and it happened to be a fish and game inspector, who found the note and quickly informed the Cedar City Sheriff’s Office. When authorities arrived they carried out a search of the area, and while there was no trace of a crashed plane, they did find an unconscious Irwin sprawled out on the ground. He was totally unresponsive, unable to be revived, and was rushed to the nearest hospital, where it was found that there seemed to be nothing physically wrong with him, other than the fact that nothing could raise him from his stupor.

Concerned medical staff did what they could and allowed the man to sleep, during which time he was allegedly heard by nurses to talk in his sleep and mumble what sounded like “jacket on bush.” The next morning he came to, and the first thing he wanted to know was whether there were any survivors of the airplane crash. Doctors informed Irwin that there had been no plane crash, which seemed to surprise the young man, and he also seemed to really want to know where his jacket was, which he had not been wearing when he was brought in and which had not been found by the search party. Although he was physically uninjured, Irwin claimed to remember absolutely nothing from the time he had left his car, and could not in any way explain how he had come to be passed out in the wilderness. He would be kept for a few days for observation, after which he was told that he was suffering from hysteria of some sort and sent on his way. This would be just the beginning of the strangeness.

In the days after his release, Irwin experienced several episodes in which he would simply faint and fall to the ground. The first incident was while he was on duty on base, which he snapped out of quickly; then it happened again as he was walking down a street in El Paso, this time putting him in the same deep state of sleep as before. This caused him to be brought in for another round of medical check-ups and psychiatric evaluations. Interestingly, after this second fainting he seems to have been reset somehow, waking up to ask about any survivors of the plane crash and thinking it was still February 28, as well as not recognizing the same medical staff who had tended to him before or remembering the past two weeks. This was enough to have him kept for observation for a full month, during which time his memory of the preceding two weeks came trickling back to him. He would eventually be released again with a clean bill of health.

Although he seemed normal, his thoughts were constantly invaded by the urge to return to the site where he had originally left his car in this strange odyssey, and one day he was so overcome by this compulsion that he left the base to go out to that lonely road and have a look around, apparently finding his jacket perched upon a bush, untouched. Oddly, there was a pencil with a note wrapped around it jammed into one of the button holes, but he never did read it, instead being overcome with the urge to burn it. He then apparently snapped out of his daze and realized that he had not left the base in a legal fashion, and was basically AWOL.

He did the right thing, turning himself in, and was subsequently questioned extensively, strangely with large holes still missing from his memory and no ability to explain what had possessed him to do what he had done. He was not jailed or disciplined severely, but it was enough to get his security clearance revoked and to have him reassigned. Things take a turn for the odder when, not long after this, he one day simply failed to report for duty, and indeed no one had heard from him. Nothing was found missing at his residence, and he seemed to have just stepped off the face of the earth. A month later Irwin was still nowhere to be seen, and the lore goes that he has not been seen since. The case would go on to be heavily investigated by UFO researchers, and speculation swirled as to what had actually happened to him. The main idea was that he was abducted by aliens and then slowly lost his mind, perhaps even being abducted one final time, this time for good. Other ideas included that he was part of some sort of government mind control program, or that he simply wandered out into the wilderness to die.

Yet things got really interesting when researcher David Booher investigated the case for his book No Return – the Gerry Irwin Story – UFO Abduction or Covert Operation?, which contains all sorts of interesting information within its pages and is one of the most comprehensive write-ups on the case there is. Booher apparently managed to actually track the “vanished” Irwin down, only to find that the man did not seem to remember much of anything of what had happened in 1959, although he was able to shed some light on it all. He claimed that he did disappear for a while to live out in the woods, but that he had finally been tracked down and disciplined, after which he was reassigned and sent to Germany, never at any point actually being discharged from the military. He also said that he was long beset by strange blackouts and behavioral issues, and that he had been involved in some top-secret assignment in Austria, although details remain vague. It is interesting to see that while in a lot of the literature Gerry Irwin is reported as having vanished into thin air, never to be seen again, he was very much alive and not vanished when Booher found him. What significance does this have for the whole strange case?

In the end it is all very perplexing. We have this military man who was suddenly found unconscious in the wilds after seeing a UFO, the missing time and lost memory, the sudden urge to just go AWOL, the disappearance that never seems to have been properly followed up on, the sending away of Irwin to Germany, almost as if to get him out of the way, and the strange blackouts and disrupted behavior. Is this someone who just lost his mind, an alien abduction, a strange government experiment, or what? Whatever the answers may be, the Gerry Irwin case has managed to gain a reputation as being one of the very first publicized “alien abduction” accounts, and has remained much talked about to this day.


To Be Human

In the 2009 sci-fi drama Moon, actor Sam Rockwell plays astronaut Sam Bell, who, in the futuristic world of 2035, is responsible for maintaining operations at Sarang, a remote mining facility located on the far side of the moon. Bell looks forward to returning home as he nears the end of a three-year solitary stint mining and shipping Helium-3, an alternative fuel used on Earth. However, when Bell accidentally crashes his lunar rover into a harvester and is rendered unconscious, the onboard computer, GERTY, believing Bell to be dead, activates another astronaut, sleeping in hibernation, to replace him. Bell manages to survive the crash and, returning to the facility, finds that the new astronaut activated to replace him is his identical clone.

In time, the two men discover that they are both clones of the original Sam Bell, manufactured to avoid the cost of hiring new astronauts, and that each comes complete with detailed memories of his life, loves, and family back home on Earth. The film asks us to consider: What does it mean to be human?

This question is difficult to answer since, according to Illuminati whistleblower Donald Marshall, it can be hard to tell whether you are a clone or not: as a clone, your body, memories, and feelings all seem perfectly real. He claims to have been an unwilling participant in top-secret cloning projects since he was a young child. He states that all world governments have engaged in covert human cloning for many decades, and he reports that he was cloned by members of a powerful organization known as the Illuminati.

Marshall explains that he was activated as a clone against his will numerous times and forced to attend secret meetings in cloning centers, located in deep underground military bases. He says that members of the Illuminati feel that they can use your clone in any way they choose since they believe that clones are not natural-born humans, and therefore, have no human rights.

In truth, many people don't realize that they don't actually own their DNA. As of 2013, the U.S. Patent and Trademark Office had assigned ownership of a variety of human genetic codes to private corporations and universities in the United States.

According to editor Mike Adams of NaturalNews.com, more than 4,000 human genes so far, amounting to as much as 20 percent of our complete genetic code, are owned by someone else. The corporation that owns the most patents, Incyte, a biopharmaceutical company based in Delaware, holds patents on over 2,000 human genetic codes. While most people would not be willing to sell their DNA, it is now legal for dozens of private corporations to claim ownership over every cell in our bodies, and those claims would be upheld if challenged in a U.S. court of law.
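As a rough sanity check on that percentage, note that the human genome contains roughly 20,000 protein-coding genes (a commonly cited estimate, and an assumption here, not a number from the article itself), so 4,000 patented genes works out to about a fifth:

    # Rough arithmetic behind the "20 percent" figure quoted above.
    # The 20,000-gene total is an approximate consensus estimate and an
    # assumption here, not a figure taken from the article.
    patented_genes = 4_000
    total_protein_coding_genes = 20_000
    share = patented_genes / total_protein_coding_genes
    print(f"{share:.0%} of protein-coding genes under patent")  # -> 20%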

One can only wonder what the future holds if private corporations have legal permission to use DNA to clone any member of the population for their own purposes.

According to Donald Marshall, human cloning is immoral, and needs to be stopped, as it is in direct violation of our inherent human rights.


No way to stop human population growth?

An asteroid impact that wiped out hundreds of millions of people would barely slow down human population growth. That’s one of the surprising results of a new computer model, which still finds that there may be a couple of things we can do to keep our numbers in check.

Every dozen years or so, we add another billion people to the planet. If the trend continued, we’d eventually run out of food and water, and we’d be unable to handle the massive amounts of waste and pollution we produce. Yet we know that population growth is already leveling off due to a combination of family planning programs and education for women. Is it possible to slow population growth even more in the next few decades? Corey Bradshaw decided to find out.

Bradshaw, a population biologist at the University of Adelaide in Australia, studies population ecology in animals. But when he gives talks at scientific meetings on declining biodiversity, audience members increasingly ask, “What about the elephant in the room? What about human population size?” he says. “I’ve modeled changing populations in other species for years,” he says, “but I never applied [those models] to human beings.”

So Bradshaw and University of Adelaide climate biologist Barry Brook decided to see how much momentum the human population has. They also wanted to see how sensitive population growth is to factors like mortality and fertility. The duo obtained data on death rates, average family size (i.e., fertility), and regional population size from the World Health Organization and the U.S. Census Bureau International Data Base. They created a computer model that projects human population growth from 2013 to 2100. They added variables to the model that they could modify to create different scenarios. Their goal was to assess how sensitive human population growth is to changes in mortality, life span, family size, and a mother’s age when she has her first baby.
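To make the mechanics concrete, here is a minimal sketch, in Python, of this kind of scenario projection. Every number in it (starting population, crude birth and death rates, shock size) is an illustrative placeholder chosen to roughly reproduce the headline figures, not one of the actual WHO or Census Bureau inputs the researchers used:

    # Minimal sketch of a scenario-style population projection, loosely
    # in the spirit of the model described above. All rates below are
    # illustrative placeholders, not the paper's real inputs.

    def project_population(start_pop, birth_rate, death_rate,
                           start_year=2013, end_year=2100,
                           shock_year=None, shock_fraction=0.0):
        """Project total population year by year from crude annual rates.

        An optional one-off 'shock' removes a fraction of the population
        in a single year, mimicking catastrophic-mortality scenarios.
        """
        pop = start_pop
        for year in range(start_year, end_year + 1):
            pop += pop * (birth_rate - death_rate)  # net natural growth
            if year == shock_year:
                pop *= 1.0 - shock_fraction         # catastrophic die-off
        return pop

    # Business-as-usual versus losing about half a billion people in 2050:
    baseline = project_population(7.1e9, birth_rate=0.014, death_rate=0.008)
    shocked = project_population(7.1e9, birth_rate=0.014, death_rate=0.008,
                                 shock_year=2050, shock_fraction=0.06)
    print(f"2100 baseline:         {baseline / 1e9:.1f} billion")  # ~12.0
    print(f"2100 with catastrophe: {shocked / 1e9:.1f} billion")   # ~11.3

Even in this toy version, a one-off loss of roughly half a billion people barely dents the 2100 total, which is the population momentum the researchers set out to measure.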

The team then created 10 scenarios, including a “business-as-usual” scenario in which death and fertility rates stayed the same as they were in 2013. The other scenarios projected the effects of alterations such as longer life spans, mothers having their first children at older ages, the imposition of a global one-child policy, and catastrophic deaths due to war or pandemics. Using the regional data, the researchers also examined the effects of population growth on biodiversity hotspots in different parts of the world.

The business-as-usual model matched U.N. projections of 12 billion people by 2100, giving the researchers confidence in their model. But they also saw booming population growth even when they introduced global catastrophic deaths of up to 5% of the population, on a par with the losses seen in World War I, World War II, and the Spanish flu. When the computer model population lost half a billion people, the total population was still 9.9 to 10.4 billion people by 2100, the team reports online today in Proceedings of the National Academy of Sciences. “It actually had very little effect on the trajectory of the human population,” Bradshaw says.

Some economists argue that shrinking populations create an unsupportable burden of elderly dependents that leads to economic collapse. But the team’s model showed otherwise. When the population is growing, more of the dependents are children, and when the population is shrinking, more are older adults, the model indicates. A dependent is always supported by 1.5 to two workers. The idea that shrinking populations cannot support older adults is a “fallacy,” Bradshaw says.

Two factors did have an impact on human population growth: eliminating unwanted pregnancies, which make up about 16% of all live births, and adopting a global one-child policy. Eliminating those births year after year resulted in population sizes in 2050 and 2100 that are comparable to those produced with a global one-child policy—about 8 billion and 7 billion, respectively.
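Using the same toy sketch from above, the no-unwanted-pregnancies scenario can be approximated by trimming the birth rate by the quoted 16 percent (again with placeholder numbers; the toy model understates the effect because it ignores age structure):

    # Approximating the 'no unwanted pregnancies' scenario by removing
    # ~16% of births each year; numbers are illustrative placeholders.
    no_unwanted = project_population(7.1e9, birth_rate=0.014 * 0.84,
                                     death_rate=0.008)
    print(f"2100 without unwanted births: {no_unwanted / 1e9:.1f} billion")

Even this crude version shows the direction of the result: a small, persistent reduction in fertility moves the 2100 endpoint far more than a one-off catastrophe does.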

The models also confirmed that the worst human impacts on biodiversity hotspots will occur in Southeast Asia and Africa, which by 2100 will likely have the highest human densities in the world. Pressures there, Bradshaw says, will be higher than anywhere else, and elephants, rhinos, and lions will likely disappear faster. “So, will my 7-year-old daughter ever see an elephant in Africa unless I get her there very quickly?” Bradshaw says. “I don’t know.”


Novavax Vaccine 100% Effective Against Both Moderate and Severe COVID-19

The U.S. is likely soon to have a fourth vaccine approved for the fight against COVID-19. Novavax just released the results of its Phase 3 clinical trial: Its two-dose vaccine demonstrates 90 percent overall efficacy and 100 percent protection against both moderate and severe COVID-19 disease. The doses are injected three weeks apart.

The company reports that 77 cases of COVID-19 were observed in its clinical trial involving nearly 30,000 participants. Of those cases, 63 occurred in the placebo group and 14 in the vaccine group. "All cases observed in the vaccine group were mild as defined by the trial protocol," notes the company's press release. "Ten moderate cases and four severe cases were observed, all in the placebo group, yielding a vaccine efficacy of 100% against moderate or severe disease." The vaccine's side effects were generally mild.
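Those headline numbers can be reproduced with quick arithmetic, with one wrinkle: the trial randomized participants roughly two-to-one, vaccine to placebo (an assumption based on the trial's published design, not stated in the excerpt above), so the raw case counts have to be normalized by arm size before comparing:

    # Back-of-the-envelope efficacy check. The 2:1 vaccine:placebo split
    # and the arm sizes below are assumptions for illustration, not
    # figures from the press release quoted above.
    vaccine_cases, placebo_cases = 14, 63
    vaccine_n, placebo_n = 20_000, 10_000  # illustrative 2:1 arm sizes

    attack_vaccine = vaccine_cases / vaccine_n
    attack_placebo = placebo_cases / placebo_n
    efficacy = 1 - attack_vaccine / attack_placebo
    print(f"Estimated overall efficacy: {efficacy:.0%}")  # -> 89%

The risk in the vaccine arm comes out to about a ninth of the risk in the placebo arm, which is where the roughly 90 percent overall figure comes from; the moderate-and-severe figure is 100 percent simply because zero such cases occurred in the vaccine arm.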

The Novavax vaccine uses a technology similar to hepatitis and pertussis vaccines, in which copies of viral proteins provoke the immune system to create antibodies that protect people when they are exposed to the actual viruses. In this case, Novavax employs the coronavirus spike protein that the virus uses to infect human cells.

More good news: The vaccine is highly effective against the more transmissible COVID-19 Alpha (B.1.1.7) variant first identified in the U.K., and it is somewhat effective against the B.1.351 (Beta) variant first identified in South Africa.

The next big step is to ask the Food and Drug Administration to approve the vaccine; Novavax plans to apply in the third quarter of this year. The company claims that once the vaccine is approved, it can reach a manufacturing capacity of 100 million doses per month by the end of the third quarter and 150 million doses per month by the end of 2021.

The mRNA vaccines developed by Pfizer/BioNTech and Moderna must be shipped at ultra-cold temperatures and thawed out before administration. The Novavax vaccine, by contrast, is stored and stable at 2° to 8°C, which makes it easier to distribute through existing vaccine supply chain channels. It could thus play a significant role in abating the ongoing pandemic in poorer parts of the world.

Ronald Bailey is science correspondent at Reason.



Wild Things: Strange Cases of Real Feral Humans

What makes us human? Are we born this way, or are we shaped and molded into what we are by society, with our own animal instincts subdued by our social mores? Indeed what happens when we strip away that touch of civilization and human upbringing? Throughout history have come tales of mysterious people who may shed light on the answers to these questions, in the form of what have come to be called feral humans. These are people who for whatever reason have grown up without human contact, in some cases raised by wild animals, and seem to have shed many of the hallmarks of what we consider to be “human.” In the absence of the influence of civilization they seem to have crossed over that barrier that we like to think separates us from wild animals, and in the process have become in a sense wild animals themselves. These are cases that offer an intriguing glimpse into the lives of people who were raised outside of the world of humans and a look at what perhaps makes us human.

One of the earliest accounts of a feral human is the somewhat well-known tale of a strange individual who would come to be known as Wild Peter. In 1724, some hunters in a rural area near Hamelin, Germany were startled to see a boy emerge from some thick woods on all fours, who was described at first as being “a naked, brownish, black-haired creature.” They attempted to coax him closer so he could be captured, but the boy darted back off into the forest with amazing speed. The strange boy, estimated to be around 12 years old, had apparently been haunting the area for some time, always seen on all fours and allegedly able to climb trees with ease. He was never heard to talk, but growled and chuffed readily.

The child was eventually captured by the hunters and shown to King George I, who had been in the area on a visit to Hanover. The king was fascinated by the strange wild child, who seemed to display no knowledge of speech, walked about on all fours snarling, and would eat only raw vegetables and meat, as well as whole birds that he would tear apart, refusing any bread or cooked food whatsoever. The untamed child showed a particular fondness for sucking the sap out of various twigs and branches, which he would strip of their bark before feasting on their contents. So enthralled with the wild boy was King George I that he had him shipped off to England to be studied by the most well-respected academics. It was upon his arrival in England in 1726 that he would come to be known as Peter the Wild Boy, or simply Wild Peter.

Peter quickly became something of a celebrity, often shown to visiting nobles at the King’s court, and people would flock to gawk at him as he scurried around on all fours and generally acted like a wild animal. He is said to have had no interest in table manners, always slept on the floor, and was known to constantly try to pick people’s pockets. Wild Peter also purportedly displayed an almost superhumanly acute sense of hearing and smell. Attempts to educate Peter generally failed. He could not be house-trained, taught manners, taught to read or write, or persuaded to sleep in a bed, and indeed he refused to ever say a single word, instead communicating through growls, snorts, and snarls. Because he wandered off on several occasions, Peter was fitted with a brass and leather collar inscribed with his name, in order to identify him if he got lost again.

Peter is estimated to have lived to be approximately 70 years old, and throughout his long life in society he was never able to learn to speak or fully adjust to the ways of civilization, although he seems to have developed a taste for music and would reportedly often hum songs to himself. When he was examined by the Scottish philosopher and judge James Burnett, Peter was described as being able to understand language to some degree, but only able to utter the words “Peter” and “King George.” Peter would eventually pass away on February 22, 1785, and be buried at a cemetery in Northchurch, where he had spent his later years living on a farm.

It has never been revealed just who Wild Peter was, where he had come from, why he lived in such a wild state, or how he had ended up living out in the wilds alone. In recent times it has been speculated that Wild Peter may have suffered from a rare disorder known as Pitt–Hopkins syndrome, which results in behavioral abnormalities, learning disorders, and certain physical characteristics matching his coarse, curly hair, drooping eyelids, and thick lips, but it is unknown whether this was really the case. The mystery of who Wild Peter was and what led him to his wild life will probably never be fully solved.

Portrait of Wild Peter from his later years

Another of the earlier accounts of a feral human is the story of Marie Angelique Memmie Le Blanc, who would go on to become widely known as “The Wild Girl of Champagne.” Marie spent an estimated 10 years wandering through the forests of France surviving on a diet of birds, frogs, fish, and other small animals, as well as leaves, branches and roots, all while fighting off wild animals such as wolves with her trusty club and sharpened sticks. It would later be estimated that she had eked out a living in the wilds in this manner from when she was at least 9 years old, until she was captured at the age of 19 in 1731 by villagers from Songy, in Champagne.

When she was first brought in from the wild, it was immediately noticed that she was quite hairy and dark-skinned, with “claws,” and that her “fingers and in particular her thumbs, were extraordinarily large.” She would use these thumbs to dig through the ground for roots, as well as to swing through trees “like a monkey.” She was also known to eat any animals that were given to her raw, such as birds and rabbits, which she would deftly skin with her bare hands, refusing any cooked food. She was very wary and animal-like in nature, refusing to drink from cups and instead leaning over to drink directly from water sources, the whole time glancing from side to side like an animal, and she purportedly could run extremely fast on all fours. Marie was once taken on a hunting excursion, where she proved able to easily run down and catch rabbits in this manner. Her only form of communication was a series of shrill squeals and shrieks, and she did not seem capable of normal speech at all.

Interestingly, after years of tutoring and a series of patrons, Marie proved herself to be quite capable of adapting to civilization. Unusually for a feral child, she would go on to learn to read and write French quite fluently and to learn enough etiquette to fit into society. For a time she became a nun, before returning to rich patrons, and she would eventually die rather well-off in Paris in 1775, at the age of 63. The story of Marie Angelique Memmie Le Blanc has been criticized over the years as perhaps heavily exaggerated or even entirely made up, although other historians, such as French author Serge Aroles, have remained adamant that the tale is likely completely true.

Another early case, and perhaps an even more famous tale, is that of Victor of Aveyron. In 1787, sightings began to pop up of a naked young boy of around 12 years old wandering through the woods near Aveyron, France. The boy was apparently very fast and seemed to be living alone in the forest, surviving off the land like a wild animal. After years of these sightings, in 1800 the boy was captured near Saint-Sernin-sur-Rance, France. Upon examination, Victor, as he came to be called, was found to be covered with a variety of scars, suggesting he had led a rough, wild life for years, possibly his entire life. Victor was described as hating to be touched, and he violently resisted all efforts to bathe him or put clothes on him. He was also unable to speak a single word and generally shunned human contact, spending most of his time sulking in the corner in the shadows. On occasion he is said to have displayed sudden, savage outbursts in which he would tear about and snarl wildly. Interestingly, he showed extreme resistance to the elements, and in one instance a visiting biologist experimented by putting the naked Victor out into the snow, where he allegedly showed no ill effects or discomfort from the cold whatsoever.

Although Victor was studied extensively by psychologists, philosophers, and scientists, it was never ascertained just where he had come from or how he had devolved into such a wild state. Neither could it ever be found just what sort of cognitive disorders or impairments he may have been suffering from, if any. Throughout his life among other humans, Victor was never able to be taught how to speak or act civilized, although a man who worked with the deaf by the name of Jean-Marc Gaspard Itard spent years working with him and eventually got him to bathe, wear clothes, and understand and respond to simple questions and commands. Victor of Aveyron would eventually die at an institute in Paris at the age of 40, having never really accepted the world of men.

Cases of feral humans go on right into the 1800s and well beyond. In 1845, locals of a rural area near San Felipe, Mexico, were surprised to see a young girl running about on all fours and joining a wolf pack as they descended upon a herd of goats. Around one year after this, what appeared to be the same girl was spotted hunched over a freshly killed goat, eating it. In this case, villagers were able to capture the mysterious wild girl, and she proceeded to howl the entire night exactly like a wolf. Ominously, her cries seemed to have worked, as it was reported that a pack of wolves emerged from the darkness to approach in what can only be described as a sort of rescue attempt. During the ensuing panic, the girl was able to sneak out into the night. She would last be sighted in 1852, when she was spotted apparently suckling two wolf cubs at the side of a river. When she was approached, what has been since referred to as “The Wolf Girl of Devil’s River” allegedly gathered up the cubs and ran off into the wilds, never to be seen again.

In 1867, a group of hunters came across what they originally thought to be a wild animal of some sort curled up and sleeping at the entrance of a cave in the Bulandshahr district of India. When the “animal” awoke, the hunters were startled to see that it was in fact a human boy of around 6 years of age, who scampered off on all fours to cavort about with a pack of wolves. The concerned hunters were able to capture the boy and bring him to the Sikandra Mission Orphanage, in Agra. There he would be given the name Dina Sanichar, although he was mostly referred to as the “Wolf Boy,” and would continue to baffle all who saw him.

Dina slept on the floor, ran about quite easily on all fours, and would rip off his clothes whenever anyone tried to dress him. Cooked food was entirely shunned, but he would readily devour raw meat or gnaw on bones. At night he would howl forlornly out into the wilderness, perhaps trying to communicate with the wolf family from which he had been forcefully taken. Dina was also resistant to years of attempts to educate him and teach him language, and he never learned to speak or write. Just about the only trapping of civilization he took to was tobacco, to which he became hopelessly addicted. He would finally pass away in 1895 at the age of 34. It has been speculated that the tale of Dina Sanichar may have been the inspiration for Rudyard Kipling’s beloved Jungle Book series.

Stories of feral children continue right on into the 1900s. One of the most famous cases of all is that of the two feral children Kamala and Amala. In 1920, a Reverend Joseph Singh was surprised to spy what looked to be two very young children within a wolf den carved out under an abandoned anthill in the wilds near Midnapore, west of Calcutta, India. With the help of nearby villagers, Singh was allegedly able to rescue the two girls from the wolf den, shooting the mother wolf in the process. The children were found to be a girl of 8 years old, who would be called Kamala, and another child of a mere 1 and a half years old, who was named Amala.

The girls were filthy and “hideous looking,” running about on all fours and described as looking less than human. Their arms and legs seemed somewhat deformed from their quadrupedal lifestyle, exhibiting shortened tendons and joints that seemed to make standing up nearly impossible for the children. Other physical anomalies reported by Singh were unusually elongated canines and very muscular, misshapen jaws. Their eyes were said to be reflective in the dark, like those of a nocturnal predator. Kamala and Amala reportedly slept curled up on the floor next to one another, and when they were awake they would pace, growl, snarl, and eat nothing but raw meat, taken from a bowl on the ground. They also avoided humans, often attempting to bite or scratch those who came near them, and they absolutely refused to be bathed or dressed. They were purportedly mostly nocturnal, had terrific night vision, and displayed incredibly acute senses of smell and hearing. Although they were always naked, the children seemed to show no discomfort in the cold, and indeed tended not to show any sort of emotion at all other than fear. At night they would constantly howl like wolves. They possessed no apparent ability to speak or even understand what was said to them.

Sadly, little Amala did not last long in this new environment, dying of a kidney infection on September 21, 1921. Kamala, who showed no sign of mourning or emotion at the passing of her companion, would fare much better, becoming more approachable and even learning to walk upright at times, although she would revert to all fours if she wanted to go somewhere quickly and clearly preferred this mode of locomotion. She also learned around 50 words and gained a rudimentary understanding of what was said to her. Kamala even learned to stomach cooked food to some degree, although she much preferred it raw. However, she never did fully adjust to civilized life, and she died of tuberculosis on November 14, 1929, at the age of 17, still quite wild in her ways.

The case of Kamala and Amala has gone on to become one of the most famous accounts of feral children there is, and at the time it was widely covered in the media. However, the case has come under a good amount of criticism in recent times, casting doubt on how much of it is actually true. For one, Singh’s diaries, which he claimed to have kept daily at the time, were found to have been written years after the fact. There is also the point that there are few if any independent sources to corroborate what Singh claims; we have only his first-hand accounts of what happened. Adding to the confusion is the existence of a variety of versions of the story, such as one in which Singh was given the two girls by a man who lived in the jungle and kept them in a cage, rather than rescuing them himself. I would also add that some elements of the story seem a bit far-fetched, such as Singh’s detail that the girls’ eyes shone at night like a cat’s. This seems a rather sensationalist flourish, considering that humans simply don’t have a tapetum lucidum, the reflective layer of the eye that causes this effect in some animals, and no amount of living in the wilderness is going to cause a human being to spontaneously evolve one. Regardless of how authentic the tale is, the case of Kamala and Amala has become one of the most widely known cases of purported feral children.

Sharing some similarities with this case is the story of a boy who was found in 1972 playing with a group of wolf cubs in the forest of Musafirkhana, about 20 miles from Sultanpur, in India. The boy was estimated to be about 4 years old, and he exhibited some unusual physical traits. His skin was incredibly dark, practically pitch black, his hair thick and matted, his knees, elbows, and palms were covered in rough calluses, his nails were long and hooked, and his teeth were reportedly sharpened to points like fangs. The boy readily bonded with dogs and jackals, but would not let humans approach him, and he would eat only live poultry, which he would kill and devour raw, showing a particular liking for the blood. The boy, who would be known simply as Shamdeo, was nocturnal and never walked upright. Shamdeo eventually learned to eat cooked food, and although he never spoke he was able to pick up some sign language. In 1978 he was renamed Pascal and moved to Mother Theresa’s Home for the Destitute and Dying, in Lucknow, where he would spend his remaining years until his death in 1985.

Wolves are not the only animals said to have brought up humans among their own. From 1912 we have the story of the Leopard Boy of India. Within the pages of a 1920 edition of the journal of the Bombay Natural History Society is a curious report written by one E.C. Stuart Baker. According to the account, in 1912 a 2-year-old boy was carried off into the night by a leopardess in the North Cachar Hills near Assam. At the time it was assumed he had been killed and eaten, but three years later hunters killed a leopardess in the jungle and found three cubs, one of which turned out to be a 5-year-old human boy. It was ascertained that he was the very child who had gone missing, and he was returned to his family, although he was rather less human than when he had left.

The Leopard Boy reportedly ran about on all fours “as fast as an adult man can run,” and could dodge around brush, trees, and other obstacles with fluid ease. His knees bore rough calluses, his toes were bent up almost at right angles, and his hands and feet were also purportedly covered in very tough skin. The boy could not speak, instead issuing guttural growls and snarls, and at first he could not be approached, as he would lash out to bite or fight with anyone who came close. He had a disconcerting habit of running about to catch village chickens, which he would kill with his teeth and bare hands and devour raw; indeed, he refused to go anywhere near cooked food. The Leopard Boy would eventually learn to speak and walk upright, but he would never adjust to his new home, and he later went blind from cataracts before his death.

In 1937, a man named George Maranz apparently visited an insane asylum in Bursa, Turkey, where he was introduced to a girl who was claimed to have been raised by bears and found among them. According to the stories, hunters had shot a protective mother bear, only to be attacked by what they called a “wood spirit.” This wood spirit turned out to be a young human child who displayed decidedly bear-like mannerisms and vocalizations. She was likewise described as being very well-built for her age and extremely vicious, as well as refusing cooked food and sleeping on the floor in a corner. Villagers of the area in which she was found claimed that a bear had kidnapped a 2-year-old child there 14 years prior, and it was widely believed that the girl in the asylum was the very same.

In 1954, a young girl of 5 named Marina Chapman was kidnapped from her village in a remote area of Colombia, and she was later abandoned by her captors and left for dead in the jungle. She was then adopted by a group of capuchin monkeys, among whom she lived for the next 5 years, eating berries, roots, and fruit just as they did, sleeping in holes in trees, and moving about and climbing on all fours just like a monkey. She would play in the trees with them, and they would groom each other. When she was finally rescued by hunters in 1959, she had mostly lost her ability to speak, and given the hardships she would face afterwards, one tends to think she would have been better off with the monkeys.

The hunters sold Marina to a brothel and into slavery, after which she escaped and survived homeless on the streets. In later years she would be adopted by a family and find work as a housekeeper, before finally settling in Bradford, Yorkshire, in the U.K. in 1977. She would go on to be happily married and have children, and she would eventually co-author a book with her youngest daughter, Vanessa James, chronicling her feral experiences, titled The Girl With No Name.

From 1960 comes the rather bizarre tale of the “Gazelle Boy” of Syria. In 1960, anthropologist Jean-Claude Auger was on a journey across the Sahara Desert and allegedly heard from the nomads there the tale of a feral child not far away. Auger went to investigate and claimed to have seen a naked child speedily galloping among a herd of gazelles. This mysterious boy was said to dart about mostly on all fours, but would on occasion assume a bipedal stance to scan the horizon. The wild child was claimed to twitch his ears and face in a manner very similar to the gazelles around him, and he seemed jumpy and wary of predators. For food, he was observed to feed on grasses and roots directly with his teeth, just like the animals around him, and his teeth had apparently become flattened like those of his large herbivore companions. Unfortunately, Auger was never able to get close to the Gazelle Boy, and an attempt in 1966 to capture him with a net attached to a helicopter met with failure. The mysterious Gazelle Boy was never caught, and it is unknown who he was or what became of him.

An even more bizarre tale is that of an 8-year-old boy named Sujit Kumar, who was found on a road in Fiji in 1978 flapping his arms like wings and clucking like a chicken. When he was brought in he ran about acting like a chicken, squatted atop chairs as if roosting, and would peck at his food rather than use his hands to eat; indeed, he seemed unable even to hold a spoon. He didn’t speak, but rather made rapid clicking sounds with his tongue. It was later learned that his parents had locked him in a chicken coop at a very young age, after which his mother had committed suicide and his father had been murdered. He was taken in by his grandfather, who continued to keep him locked up in the coop. In his years confined with the chickens, Sujit had learned to act like them and mimic their behavior. He was eventually adopted by an Elizabeth Clayton, and he learned to speak and act human again.

Stories of feral humans have continued right up into surprisingly modern times. One well-known, fairly recent case is that of John Ssebunya, also known as “The Monkey Boy of Uganda.” At the age of 3 or 4, John ran away from his home in Uganda in 1988 after witnessing the horrific sight of his alcoholic father murdering his mother. He ran and ran out into the jungle and disappeared. Three years later, in 1991, he was found living among a group of vervet monkeys by a woman collecting firewood. Like many other feral children, John displayed the same sort of behavior as the animals he had been living with, in this case walking and climbing like a monkey, eating only the foods the monkeys ate, such as roots, nuts, sweet potatoes, and cassava, and generally avoiding human contact at first. He was also unable to speak, instead chattering like a monkey. His knees and elbows were heavily callused, his nails had grown long and curved, and he was very hairy. After being put in an orphanage for a while, John was eventually taken in by a family and learned to adapt to his new environment. Unlike many feral children, he was ultimately able to learn human language again, and indeed he has become a rather good singer, often touring with The Pearl of Africa Children’s Choir.

In 1991, 8-year-old Oxana Malaya was found living among dogs in Ukraine. As a toddler, Oxana had been confined to a dog kennel behind the home of her neglectful and abusive alcoholic parents. For 6 years Oxana lived among the dogs, with little to no human contact at all. As a result, she was raised more like a dog than a human, and she picked up many of their habits and mannerisms. When she was found, Oxana walked about on all fours and panted with her tongue hanging out, just like a dog. She would also bare her teeth and snarl or bark when approached. Oxana could not speak other than the words “yes” and “no,” mostly barking like a dog instead; she sniffed at food and ate it without using her hands, and she crouched and slept in the manner of dogs. She is also claimed to have developed extremely acute senses of smell, sight, and hearing. Oxana eventually learned to speak and walk upright again, but she remains mentally impaired. She currently lives at a mental institution in Odessa, where she is charged with taking care of the facility’s farm animals.

Similar to Oxana’s case is that of a young girl known simply as Madina. Found by social workers in Russia in 2013, she was 3 years old and had apparently spent her whole life being raised by a pack of feral dogs. When she was found she was naked, nimbly ran about on all fours, and barked and snarled at her “rescuers.” She knew only two words, “yes” and “no.” When Madina was examined she was found to be physically healthy; she quickly picked up language and is thought to be young enough to adjust to normal life.

Not all feral humans are children. In 2007, a naked adult woman of around 27 years of age was caught trying to steal food in a remote village in the Ratanakiri province of Cambodia. It was later determined that she was the daughter of the village policeman, who recognized her by a distinctive scar on her arm. The girl, named Rochom P’ngien, had gone missing in 1988, when she was 8 years old, along with her 6-year-old sister, as they were out tending their water buffalo. They had been presumed to have gotten lost in the jungle and died.

Rochom was practically unrecognizable when she was found. She was filthy, covered with scars, and her hair was a matted mess. She did not walk upright but on all fours, and she was unable to speak, instead resorting to gestures and grunts, able only to say the words “father,” “mother,” and “stomachache.” Over the next several years she learned to bathe and dress herself, but she was also quick to tear off her clothes at a moment’s notice, and she was very picky about what she ate, to the point that she was at least once hospitalized for malnutrition. The mysterious woman eventually learned to speak some words of her language, as well as some social skills such as eye contact and smiling, but she never gained more than a rudimentary ability to communicate. Rather than eat and sleep within her family’s home, she took to living in a chicken coop on the property and mostly ate alone. Rochom was also prone to constantly running away from home, sometimes for up to a month at a time, before finally returning from the jungle. In one such incident in 2010, she reportedly vanished for 11 days, after which she was found at the bottom of a 10-meter-deep latrine in the jungle.

In later years some doubt has been cast on the woman’s true identity. In July of 2016, a man named Pel was traveling through the area from Vietnam’s Gia Lai province and claimed that the woman was his long-missing daughter, who he said was named Tak and had vanished from her village in 2006. After 2 weeks of analyzing various documentation and testimony provided by Pel in an effort to prove she was indeed his daughter, authorities eventually granted him custody of the mysterious woman, and she was brought back to Vietnam. What exactly happened to the “Cambodian Jungle Girl” in the years between her disappearance and her discovery remains a mystery.

There are far more tales of feral humans than I have been able to cover here, a startling number of them in fact. There seems to be a certain innate attraction we have to these stories, and they have become so intertwined with exaggeration, myth, and legend that it is often difficult to tell where fantasy ends and reality begins. However, such accounts are always fascinating, and they hold deep mysteries and potential insight into human nature. What do these mysterious people tell us about the human condition? Are we made human by our surroundings, culture, and language? Are we civilized and tamed by society, yet harboring within us a more animalistic side that lurks beneath the surface, pulsing underneath the veneer of civilization? It seems that with these accounts of feral people we get a glimpse of that animal side, a raw, wild aspect of our nature that most of us will never experience, and perhaps this peek into our animal nature can give us insight into just what it means to be human.


An unfortunate Nobel Prize turned another speculation into a scientific “fact”

Virology, which had previously and thoroughly refuted itself, experienced a rebirth only by chance, indirectly, and in an extremely unscientific way: through the awarding of the 1954 Nobel Prize in Medicine for an observation made six years earlier, within the old “a virus is a poisonous protein” school of virology. This scientifically unjustifiable Nobel Prize was able to revive the already refuted belief that viruses exist, because the Prize automatically turned a completely different speculation, published by the laureate six months earlier, on June 1, 1954, into a never-questioned “scientific fact,” and this alleged fact became the basis of all future virology. On June 1, 1954, John Franklin Enders speculated that it might be possible to multiply viruses even without knowing them.

Enders and his colleague believed that the death of animal tissues in the test tube could be interpreted as evidence of the presence and multiplication of viruses. After the Nobel Prize, “the whole world” believed this and assumed it to be a scientifically proven fact. Yet in the publication of their speculation of June 1, 1954 (xvii), Enders and his colleague explicitly and repeatedly pointed out that this dying of animal tissues in the test tube probably has nothing to do with the processes in humans, and that unknown factors or viruses could be causing the death of the animal tissue; for they observed that the tissues died even when no supposedly infected materials from sick people were placed on them. They therefore urged that these observations be strictly reviewed in the future with scientific methods.

After the Nobel Prize ceremony on December 10, 1954, these warnings of June 1, 1954 were forgotten, and Enders claimed a little later that all future vaccine development would be based on exactly this speculation of June 1, 1954. And so it has actually been to this day. A Nobel Prize turned a refuted speculation into a scientific fact, one that led to the resurgence of the disproved virology and, as a direct consequence, to Corona.

That the speculation of June 1, 1954 and the rebirth of virology were possible at all is because Enders, and all virologists since, have not carried out the control experiments prescribed in science. That is why Enders and all virologists to this day have overlooked the fact that they themselves unintentionally and unnoticed kill the tissues and cells in the test tube, by starving them and by poisoning them with certain antibiotics that kill not only bacteria but also human, animal, and plant tissue. With the now-final ruling in the measles virus trial, the entire discipline of virology has been deprived of its scientific, and thus its legal, basis. It was established and confirmed at all three levels of the court that the publication by Enders of June 1, 1954 does not contain any evidence of the existence of a virus.

i See the work on this by the association www.libertas-sanitas.de

ii Ursula Stoll and Stefan Lanka: Corona. Further into chaos or opportunity for everyone. Book, 210 pages. 2nd edition, 2021. Available from wplus-verlag.de or http://www.praxis-neue-medizin-verlag.de/

iii Introduction to a New Perspective on Life, Parts I to III. To be found in issues No. 1, 2 and 3/2019 of the magazine WissenschafftPlus. Available from wplus-verlag.de

iv Quote from Plato (427–c. 348 BC), Greek philosopher and student of Socrates: “Just as one should not undertake to heal the eyes without the head, nor the head without the whole body, so also not the body without the soul; and this would also be the reason why most diseases have outgrown the doctors of the Hellenes: they misunderstand the whole which one ought to care for, and if the whole is not well, no part of it can be well. For everything [...] springs from the soul, evil and good, for the body and for the whole person, and flows to him from there as from the head to the eyes.” Source: Plato, Charmides, written around 380 BC, 156e. Translated by Friedrich Schleiermacher.

v The physician and author Seamus O’Mahony, in his 2019 book “Can Medicine Be Cured? The Corruption of a Profession,” comes to a grim prognosis: in his view, only a humanitarian catastrophe can bring about a reform of medicine.

vi Giuliana Lüssi: Universal Biology – A Way of Life. Book, 180 pages. 2nd edition, 2021. Available from wplus-verlag.de or praxis-neue-medizin.de

vii Stefan Lanka: The perpetrators of the Corona crisis are clearly identified. Virologists who claim disease-causing viruses exist are science fraudsters and must be prosecuted. Article in the magazine WissenschafftPlus 4/2020, available via wplus-verlag.de. The article is freely available on the internet at wissenschafftplus.de, under “Important Texts.”

viii A new coronavirus associated with human respiratory disease in China. Nature 579: 265–269 (2020). https://doi.org/10.1038/s41586-020-2008-3. This publication describes the first conceptual/computational construction of the genetic strand (genome) of SARS-CoV-2. This calculated genome became the exclusive template for all subsequent constructions, which merely repeated what this work specified. The paper did not appear online until February 3, 2020, which gives the impression that the second publication on the subject (see below), which appeared earlier, originated independently of it. This is not the case: the gene sequence calculated in this first work had already been published on the internet on January 10, 2020, and thus became the model and template for what SARS-CoV-2 was supposed to be.

The second scientific publication on the new coronavirus, which confirmed the conceptual/computational construction of the viral genome from the first work and supplemented it with electron-microscope images of the “virus”: A Novel Coronavirus from Patients with Pneumonia in China, 2019. N Engl J Med 2020; 382: 727–33. DOI: 10.1056/NEJMoa2001017. Published online in advance on January 24, 2020.

ix Stefan Lanka: Misinterpretation Virus. Part I (in WissenschafftPlus 1/2020) and Part II (in WissenschafftPlus 2/2020). The article “Misinterpretation Virus Part II” is freely available on the internet at wissenschafftplus.de, under “Important Texts.”

x See projekt-immanuel.de and the YouTube channel “Dean's Danes.”

xi See “Corona_Fakten” on Telegram, which is available as an app for mobile devices and PCs.

xii To get an idea of how HIV/AIDS is still killing today, I recommend the film “I Won’t Go Quietly!” by Anne Sono, freely available on the internet; through her commitment and her films she has also made a positive name for herself in home schooling.

xiii Karlheinz Lüdtke: On the history of early virus research. How technical advances in the study of “filterable” infectious agents shaped the understanding of the nature of viruses. Reprint No. 125 (1999) of the Max Planck Institute for the History of Science, 89 pages.

xiv Attempts to study human DNA have produced only refutations of all models of heredity and of the role of DNA. See the article “Erbgut in Auflösung” in DIE ZEIT of June 12, 2008, which can be found free on the internet, and our numerous articles about “genetic engineering” in the magazine WissenschafftPlus.


