Taking Back Our Stolen History


An infectious disease that was allegedly eradicated in 1977 (it was then renamed monkeypox, whitepox, and camelpox: clinically identical symptoms that appeared during the smallpox vaccine push). The medical establishment continues to credit the smallpox vaccine with eliminating smallpox, but the vaccine was actually a failure. Mortality rates from smallpox increased after compulsory vaccinations, from 2.04 per 10,000 individuals in 1850 to 10.24 per 10,000 in 1871. Leicester, England had one of the highest vaccination rates in the world, yet its smallpox outbreak was worse than ever. The town decided to stop the smallpox vaccine, and once it did, it had the lowest rate of smallpox infections and deaths in the world.
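Taken at face value, the mortality figures above imply roughly a five-fold rise. A minimal Python sketch of that arithmetic (only the two per-10,000 rates come from the text; nothing else is assumed):

```python
# Reported smallpox mortality per 10,000 individuals
# (figures as cited in the text above; no other data assumed)
rate_1850 = 2.04   # before compulsory vaccination
rate_1871 = 10.24  # after compulsory vaccination

fold_increase = rate_1871 / rate_1850
print(f"{fold_increase:.2f}-fold increase")  # -> 5.02-fold increase
```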

Every vaccine has a story behind it, says Dr. Suzanne Humphries, author of Dissolving Illusions: Disease, Vaccines, and the Forgotten History. The smallpox vaccine, for example, was developed long before the medical establishment knew anything about the human immune system. The revelations on smallpox alone make the book worth purchasing, and its account is far more detailed than the summary in this article.

The vaccine was actually developed based on a rumor circulating among dairy maids. The rumor was that when a dairy maid had been infected with cowpox—which is a common infection on the udder of the cow—she would no longer be susceptible to smallpox. The rumor was a persistent one, as rumors can be, despite the fact that there were plenty of dairy maids who developed smallpox after having cowpox. But this rumor is what led Edward Jenner to develop the first smallpox vaccine.

“Basically, it was made by scraping pus off the belly of a cow,” Dr. Humphries says. “Sometimes there was some goat genetic disease in there. There was horsepox mixed in there.

There was sometimes human pox mixed in and some glycerin. They would shake it up; they would take kind of a prong, and puncture the skin several times…

What I didn’t realize was that there were many people who developed serious smallpox disease and died after they were vaccinated. The severity of disease was often worse in the vaccinated than the unvaccinated.

There are statistics that show that the death rate was higher in the vaccinated than the unvaccinated.”

When the smallpox vaccine was developed, there was also no way to accurately diagnose the type of pox disease a person had. It may have been chickenpox, monkeypox, or smallpox, but back then, any kind of pox disease was considered smallpox—even though the vaccine didn’t actually have the human smallpox virus in it. Animal pox virus was always used. According to Dr. Humphries, it was the most contaminated vaccine that’s ever been on the market.

“If you look at a town like Leicester in England, that town was noticing that they had one of the highest vaccination rates in the vaccinated world and their smallpox breakout was higher than ever,” Dr. Humphries says. “The people in the town had a rally. The mayor and some of the health officials were there. They all agreed that they were going to stop vaccinating… The result was quite different from the predictions.

The predictions were that there was going to be a bonfire of disease set upon the planet and that these people in Leicester were risking the health of the world by not making vaccination mandatory. But once they stopped smallpox vaccines they had the lowest rate of smallpox infection and deaths.

What we show in our book – and we show the graphs of the disease rates and the death rates – was that both of them went down precipitously after the vaccinations were stopped. That story right there tells you that vaccines were not what made the disease go away; what made the disease go away was isolation and sanitation.”

The history of vaccination begins with attempts to reduce the number of people who died as a result of smallpox infection (3, 8).  The process of inducing immunity to a disease by exposing a non-immune person to the virus that causes it began centuries ago and was known as “inoculation” (2).  Inoculation against smallpox involved using a knife, lancet, or scalpel to make a cut in the arm or leg of the patient and then transferring biological matter taken directly from the oozing pustule of an infected person (9).  This process, called arm-to-arm inoculation, resulted in the inoculated person developing a form of the illness, but the course tended to be shorter in duration and milder in symptoms.  Some people died as a result of inoculation, but those who recovered were immune to smallpox for life (2).

As the incidence of smallpox increased in North America during the 1700s, inoculation (or variolation, as the procedure had come to be known) against the virus became more widely used (2).  Two of the most well-known proponents of variolation were Rev. Cotton Mather and Dr. Zabdiel Boylston.  Mather and Boylston performed and promoted the procedure among the citizens of New England, beginning in 1721 (1).  Their activities were well-received by some, but many people were suspicious of the practice and believed variolation was as dangerous as contracting smallpox naturally.  Using statistical analysis to compare the death rates among the approximately 6,000 citizens of Boston who contracted smallpox during the 1721 epidemic, Mather and Boylston demonstrated that the death rate among those who were variolated was 2%, while among those who contracted the naturally occurring form of smallpox it was 14% (2).
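The comparison above reduces to a simple ratio of case-fatality rates. A quick Python check of the figures as cited (only the two percentages come from the text; the cohort sizes are not given and are not assumed here):

```python
# Case-fatality rates from the 1721 Boston comparison cited above;
# only the two percentages come from the text (cohort sizes are not given)
cfr_variolated = 0.02  # 2% of variolated cases died
cfr_natural = 0.14     # 14% of natural infections died

relative_risk = cfr_natural / cfr_variolated
print(f"Natural infection was {relative_risk:.0f}x as deadly as variolation")  # -> 7x
```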

The success of Mather and Boylston’s use of variolation in New England led to wider acceptance of the process in Europe, where smallpox had resulted in the deaths of more than a few young members of the ruling class (2).  Variolation grew in popularity and was practiced widely among the European aristocracy during the mid-to-late 18th century.  However, despite the success and popularity of the procedure, there were well-founded concerns about safety, due to the number of people who developed not only smallpox, but other blood-borne diseases including syphilis and tuberculosis, as a result of undergoing variolation (2).

In the late 1700s, a young English physician named Edward Jenner began experimenting with using cowpox virus to inoculate humans against smallpox (8).  Jenner collected data in controlled experiments and wrote articles publishing his findings among his peers in the medical societies of Europe.  Jenner coined the term “vaccination” to refer to his procedure, taking the Latin vacca (cow) and vaccinia (cowpox) as the root (2).

Jenner not only shared his knowledge through published literature, he also shared the vaccine itself, giving samples to other physicians and anyone else who requested it. Use of the cowpox vaccine grew as those who received it from Jenner passed it on to others.  In 1800, Dr. John Haygarth was responsible for the introduction of the vaccine in the United States when he sent some of Jenner’s cowpox vaccine to a professor of physic (medicine) at Harvard University.  After introducing the vaccination in New England, Professor Benjamin Waterhouse convinced Thomas Jefferson to try the vaccination in Virginia.  It was this contact between Waterhouse and Jefferson that led to the establishment of the National Vaccine Institute and the implementation of the United States’ first national vaccination program (2).

The first mandatory vaccination law in the United States was enacted in 1809 in Massachusetts, giving the government the power to enforce mandatory vaccination or quarantine in the event of a disease (smallpox) outbreak that posed a threat to the public health (10).  Throughout most of the 1800s, vaccination against smallpox in the United States was voluntary, though coercion was often used to convince citizens to undergo the procedure.  During the 19th century, the widely held belief that those who suffered the ravages of poverty were responsible for their own circumstances, due to a lack of moral fortitude, contributed to the success of “the vaccinators.”  Outbreaks of smallpox and other highly contagious diseases such as scarlet fever, measles, and diphtheria almost always originated in populations of impoverished immigrants living in overcrowded conditions characterized by poor sanitation and lack of access to heat, clean water, or nutritious food.  Because illness and poverty were so frequently and closely linked, proponents of vaccination promoted acceptance of the procedure on moral grounds to increase the number of those willing to accept vaccination against smallpox, even as the incidence of the disease continued to decline in the general population (6).

Two outbreaks of smallpox in New York’s German and Italian immigrant communities around the turn of the century set the stage for the establishment of policies for compulsory vaccination. In response to the increasing number of infections, Brooklyn’s Republican mayor, Charles Schieren, appointed Dr. Z. Taylor Emery to head the city’s health department.  Emery was charged with increasing vaccination coverage, particularly among the city’s immigrants.  To achieve this goal, Emery was granted the power to mandate vaccination and enforce quarantines for anyone who refused.  The number of health department employees responsible for delivering vaccines (i.e., “vaccinators”) was increased, eventually numbering more than 200, and teams of vaccinators fanned out across the city, enforcing the procedure in tenements and apartment buildings that housed mainly poor, immigrant families. Colgrove (6) reports that according to the official “Rules for Vaccinators,” quarantines could be ordered and enforced by the Sanitary Police only if there had been a confirmed case of smallpox infection in the vicinity.  However, according to reports published in the Brooklyn Daily Eagle newspaper (cited in Colgrove, 2006), as time went on it became increasingly common for families to be quarantined without cause, even in the absence of any identified cases of infection.  According to Colgrove’s review, families who were quarantined were denied access to employment or even food deliveries until they eventually acquiesced to the demands of the vaccinators (6).

The city’s homeless population was targeted by the vaccinators, who descended on the rooming houses where the 2,400 homeless members of Brooklyn’s society congregated.  Colgrove (2006) reports on the language of a health department publication, which reveals the attitudes toward those who lived in poverty.  According to Colgrove, the report discussed the importance of targeting Brooklyn’s 72 lodging houses, stating, “In them are gathered nightly a large proportion of those homeless and vagrant ones in our population whose unwholesome heredity and unsanitary lives render them liable not only to the commission of crimes, but to the contraction of disease” (Colgrove, 2006, p. 25).

To ensure the highest possible rate of vaccination among the city’s homeless population, the proprietors of the lodging houses were forced to require their tenants to produce proof of vaccination status as a condition of receiving a room for the night. Proprietors who failed to comply with the mandate were threatened with loss of licensure (6).

As the aggressiveness of Emery’s teams of vaccinators continued to increase, they began targeting businesses and enforced mandatory vaccination on employees, who were threatened with losing their jobs if they did not comply.  In his review of the social context of Brooklyn in the 1890s, Colgrove (2006) reports that the fact that the city was in the middle of a severe depression most likely had a significant impact on the willingness of employees to accept vaccination rather than risk unemployment.  Colgrove also notes that the effects of the depression most likely contributed to the zeal with which the vaccinators pursued their prey, as the administrators of the smallpox vaccine were paid thirty cents for each person they vaccinated.

As public complaints against the vaccinators increased, one letter published in The Daily Eagle newspaper charged that paying the vaccinators in this manner “created an incentive for them to ‘terrorize or intimidate healthy people to be revaccinated by them under penalty of quarantine for refusal’” (Colgrove, 2006, p. 23).

During the smallpox outbreak of 1893-1894, resistance against compulsory vaccination increased, and several legal challenges were filed by the Anti-Vaccination League, a grass-roots organization based in Brooklyn composed largely of homeopathic physicians.  Throughout the mid-to-late 1890s, the Anti-Vaccination League filed several lawsuits against the local government and Emery, asserting that the actions of the vaccinators were infringing on the Constitutional rights of United States citizens.  In addition, there were frequent allegations that the vaccinators falsified death records in an effort to cover up the fact that their zealous activities were resulting in death for a percentage of those citizens they were charged with protecting (6).

Despite multiple challenges from the Anti-Vaccination League during the 1890s in Brooklyn, the conflict leading to the seminal legal ruling regarding compulsory vaccination did not begin until the winter of 1902, and it took place in Cambridge, Massachusetts. The case of Jacobson v. Massachusetts involved an adult, Henning Jacobson, who refused smallpox vaccination and also refused to pay the $5.00 fine imposed for not complying with the health board’s order of compulsory vaccination.  At trial, Jacobson presented evidence that vaccination was dangerous and frequently caused serious injury or death, and that he himself had been harmed by vaccination as a child.  The case went all the way to the Supreme Court, which ultimately ruled that the State (in this case, Massachusetts) was not unreasonable and had not violated Jacobson’s Constitutional rights because, in the midst of the smallpox epidemic, laws enacted by the state had a “real and substantial relation to the protection of the public health and safety” (10; Welborn, 2005, pp. 1-2).

The ruling of the Supreme Court affirmed the sovereignty of individual states to enact and enforce laws to protect the public health and safety, with the only provision being that in doing so, state laws did not violate the United States Constitution or infringe on rights granted by it (10).

Many states had laws on the books regarding compulsory vaccination against smallpox as a prerequisite for school enrollment, beginning in the early 19th century (11).  The first recorded mandate in the U.S. was in 1827 when smallpox vaccination became a requirement for entry into public school (12).  However, the laws were not widely enforced or challenged until the smallpox epidemic of 1893-1894 (6).  In 1894 a lawsuit was filed against the principal of a public school in Brooklyn, seeking admittance of the two children of a physician, Charles Walters. Dr. Walters was involved with the Anti-Vaccination League and his children had not been vaccinated against smallpox (6).

Despite the fact that court rulings had generally been decided in favor of the civil rights of adults who had challenged mandatory vaccination (6), the earliest cases involving school attendance were decided in favor of the state and local authorities. In the case of Walters v. Public School No. 22, The Court ruled that compulsory vaccination could be enforced through city and state laws as long as they did not violate the U.S. Constitution in doing so.  The basis for the judge’s decision was that attendance in a public school was a privilege and not a right.  Colgrove reports the findings of the case, quoting the judge’s ruling, “A common school education, under the existing constitution of the State of New York, is a privilege rather than a right… It follows that the State can certainly exercise this discretion by debarring from attendance at the public schools such persons as are unwilling to adopt a precaution which, in the judgment of the legislature, is essential to the preservation of the health of the large body of scholars” (6; Colgrove, 2006, p. 28-29).

The question of whether mandatory vaccination against smallpox as a condition of attendance in public school violated Constitutional rights was heard again by the Supreme Court in 1922, and again the Court ruled that mandatory vaccination was legally enforceable, basing its decision on the precedent established by the Jacobson v. Massachusetts case of 1905 (11).

The Expansion of the Childhood Vaccination Schedule

During the first half of the twentieth century there was a great expansion in vaccine research, leading to the development of new vaccines for pertussis, diphtheria, and tetanus in 1902, 1926, and 1938, respectively (9).  The polio vaccine was licensed in 1955, followed by the development of the measles, mumps, and rubella vaccines in the late 1960s (9).  Although smallpox vaccination had been a requirement of school attendance for decades, it was during the late 1930s that compulsory vaccination against other illnesses began to be instituted for children enrolling in public school (11).  By 1942, nine states had adopted laws requiring immunization against diphtheria, in addition to smallpox vaccination, for school children (11).  It was not until the 1980s that laws regarding vaccination of children in public school were expanded to include more than one or two vaccines.  Many of the laws concerning mandatory vaccination of school children sprang up as a result of measles outbreaks in the 1960s and 1970s (10).

Antibody Is the Wrong Way to Ascertain Immunity

One of the major arguments against vaccine-induced immunity is that it primarily stimulates the humoral immune system and not the cellular immune system. Antibodies are produced by the humoral immune system and then routinely measured to determine “immunity.” The problem with this approach is that you can have high antibody levels and still get the disease. It’s very difficult and expensive to measure the cellular immune response, and immunologists admit that they are still in the dark about a lot of the finer points of the overall immune response.

When you use antibody titers or blood levels to check for immunity, all you’re doing is getting a picture of what happened (you had an immune response); it doesn’t tell you whether you’re going to be immune in the future, because antibodies are only one aspect of the immune response, and in some cases are not even necessary to combat the sickness and become immune.

For example, those with agammaglobulinemia (a disease in which you cannot make antibodies) can get infected with measles, recover uneventfully, and still respond to subsequent challenges of the virus in a normal, healthy fashion and not get sick. These individuals will have lifelong immunity to measles, the same as someone without agammaglobulinemia.

Traditionally, the way immunity is determined is to do a test that measures antibodies, which is the humoral immune system. But there’s no good way to assess the cellular immune system. It’s a really imprecise science at best. As Dr. Humphries notes:

“It’s not only imprecise; sometimes it’s downright inaccurate. You can have very high antibody levels, like numerous case reports of people who have hugely high antibody levels for tetanus, or normal antibodies, and have gotten some of the worst cases of tetanus. I have papers that show that people without antibody for polio have actually been able to respond to the virus as if they were already immune. The antibody really is a real wrong roadmap to look at to tell what’s really going on. Sometimes there’s correlation, but it’s certainly not a given.”

Non-Pharmaceutical Interventions & Medical Tyranny

The first mandatory vaccination law in the US was enacted in 1809 in Massachusetts, giving the government the power to enforce mandatory vaccination or quarantine. In Brooklyn, NY, the mayor mandated vaccines and quarantines and paid “vaccinators” ($0.30 per jab, or roughly $7 today) to go door-to-door in immigrant apartment neighborhoods to forcibly vaccinate or quarantine residents (many without cause). Coercion was used in Brooklyn, Massachusetts, and elsewhere in the US to convince people to get the smallpox vaccine.

According to Colgrove’s review, families who were quarantined were denied access to employment or even food deliveries until they eventually acquiesced to the demands of the vaccinators. Apartment owners were forced to require their tenants to produce proof of vaccination status as a condition of receiving a room. The fact that the city was in the middle of a severe depression had a significant impact both on the willingness of employees to accept vaccination rather than risk unemployment and on the aggressiveness of the vaccinators, who often terrorized healthy people into being revaccinated.

During the smallpox outbreak of 1893-1894, resistance against compulsory vaccination increased, and several legal challenges were filed for violations of Constitutional rights by the Anti-Vaccination League, a grass-roots organization based in Brooklyn composed largely of homeopathic physicians. In addition, there were frequent allegations that the vaccinators falsified death records in an effort to cover up the fact that their zealous activities were resulting in death for a percentage of those citizens they were charged with protecting.

Smallpox Vaccine in Africa in the 1980’s Caused AIDS Pandemic

On May 11, 1987, The London Times, one of the world’s most respected newspapers, published an explosive article entitled, “Smallpox vaccine triggered AIDS virus.”

The story suggested the smallpox eradication vaccine program sponsored by the WHO (World Health Organization) was responsible for unleashing AIDS in Africa. Almost 100 million Africans living in central Africa were inoculated by the WHO. The vaccine was held responsible for awakening a “dormant” AIDS virus infection on the continent.

An advisor to the WHO admitted, “Now I believe the smallpox vaccine theory is the explanation for the explosion of AIDS.”

Robert Gallo, M.D., the co-discoverer of HIV, told The Times, “The link between the WHO program and the epidemic is an interesting and important hypothesis.

I cannot say that it actually happened, but I have been saying for some years that the use of live vaccines such as that used for smallpox can activate a dormant infection such as HIV.” Despite the tremendous importance of this story, the U.S. media was totally silent on the report, and Gallo never spoke of it again.

In September 1987, at a conference sponsored by the National Health Federation in Monrovia, California, William Campbell Douglass, M.D., bluntly blamed the WHO for murdering Africa with the AIDS virus.

In a widely circulated reprint of his talk entitled “W.H.O. Murdered Africa,” he accused the organization of encouraging virologists and molecular biologists to work with deadly animal viruses in an attempt to make an immunosuppressive hybrid virus that would be deadly to humans.

From the Bulletin of the World Health Organization (Volume 47, p.259, 1972), he quoted a passage that stated: “An attempt should be made to see if viruses can in fact exert selective effects on immune function. The possibility should be looked into that the immune response to the virus itself may be impaired if the infecting virus damages, more or less selectively, the cell responding to the virus.”

According to Douglass, “That’s AIDS. What the WHO is saying in plain English is, ‘Let’s cook up a virus that selectively destroys the T-cell system of man, an acquired immune deficiency.’” The entire article can be found online by searching for “W.H.O. Murdered Africa.”

In his 1989 book, ‘AIDS: The End of Civilization,’ Douglass claims the WHO laced the African vaccines. He blames “the virologists of the world, the sorcerers who brought us this ghastly plague,” who “have formed a united front in denying that the virus was laboratory-made from known, lethal animal viruses. The scientific party line is that a monkey in Africa with AIDS bit a native on the butt. The native then went to town and gave it to a prostitute who gave it to a local banker who gave it to his wife and three girl friends, and wham – 75 million people became infected with AIDS in Africa. An entirely preposterous story.”