The Arrow #219

Hello everyone.

Greetings from Montecito, where it has been pouring rain and cold for the last couple of days. Today it has been alternating between rain and a kind of watery sunshine. It seems I can’t find good weather anywhere I go lately.

I’ve got to kick off today’s Arrow with a comment I got from a reader in Israel. It is about the graphic below I displayed last week.

The red columns on top represent suspected vaccine-related deaths in those who have been vaccinated against measles. The blue columns on the bottom are measles-associated deaths in those who were unvaccinated.

I commented that "There are about 13 times more kids who are vaccinated than there are those who aren’t. If you apply that to the stats above, it makes the deaths in red much lower than those in blue (non-vaccinated)."

I did this via a quick mental calculation. I should have been more careful and done the math properly.

As my reader pointed out:

There were 144 suspected vaccine-related deaths reported in total in the period in question, compared to 9 measles-associated deaths in the unvaccinated. That's exactly 16 times more. If the population of vaccinated kids is 13 times the size, that would cut the difference down greatly, but it would still leave the 'red deaths' (vaccinated) 23% higher per capita than the measles-associated 'blue deaths' (unvaccinated).

Did I miss something here?

No, he did not miss something. I did. He is correct.

Based on the figures I found on the number of US kids vaccinated versus unvaccinated, it turns out that a bit more than 13 times as many kids were vaccinated as unvaccinated. When you apply that difference to the numbers shown in the graphic above (144 deaths in the vaccinated vs 9 deaths in the unvaccinated), there were about 23 percent more deaths per capita in the vaccinated group than in the unvaccinated. Just as my reader calculated.
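For anyone who wants to check the arithmetic, here is a minimal sketch of the per-capita comparison. The 13-to-1 population ratio is my rough estimate from the vaccination figures, not an exact count:

```python
# Per-capita comparison using the figures from the graphic above.
# The 13x population ratio is a rough estimate, not an exact count.
vaccinated_deaths = 144    # suspected vaccine-related deaths (red)
unvaccinated_deaths = 9    # measles-associated deaths (blue)
population_ratio = 13      # vaccinated kids per unvaccinated kid

# Express both death counts per unit of unvaccinated population.
rate_vaccinated = vaccinated_deaths / population_ratio   # ~11.1
rate_unvaccinated = unvaccinated_deaths                  # 9.0

excess = rate_vaccinated / rate_unvaccinated - 1
print(f"Vaccinated group: {excess:.0%} more deaths per capita")  # ~23%
```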

I mentioned last week that VAERS is a voluntary reporting system. Those who are on board with the vaccines always retort when VAERS is mentioned that these are not official figures, the deaths aren’t validated, and the numbers aren’t to be believed.

Those who are on the vaccine skeptical side of the argument point out that only 1-2 percent of vaccine injuries or deaths are reported to VAERS. One can only imagine how many vaccine deaths may have occurred that weren’t reported. I, myself, know of two for sure.

I apologize for doing such a sloppy job last week analyzing these data by eyeball. I’m glad my perceptive reader called me out on it.

Since I’ve spent so much time studying vaccines over the past four or so years, my bias is against them until they are tested properly. So in doing my eyeball estimate last week, I actually went against my own bias. I’m glad I was wrong.

While we’re on the subject of vaccines, I have delved into every source I can find in Texas trying to discover the health status of the little girl who they say died of measles. I’ve come up empty handed everywhere I’ve looked.

Then I found an X post. I’m reluctant to put it up, because I have no way of verifying if what the woman says is true or not. But it makes sense. Since I’ve had measles myself and had siblings and countless friends who had the measles, I know what an innocuous infection it is to normal kids. I highly suspect there were other issues with the child who died in Texas, because normal, healthy kids don’t usually die from measles.

I’m throwing this X post up for you to watch. Maybe someone will know the woman in it and be able to confirm. But what she says makes perfect sense given what I know about the disease.

The person who posted this has an obvious anti-vax bias based on the rest of his/her posts. I don’t know if the lady in the video is known to the poster or not. I went through all the comments looking for anything that might provide more info, but alas found nothing. If anyone knows this lady, or knows where this video was originally posted, please let me know.

Doctor Apologizes to His Patients For Vaccinating Without Informed Consent

Dr. Daniel Neides, formerly of the prestigious Cleveland Clinic, makes an emotional apology in the video below to his patients for subjecting them to vaccines without providing them with informed consent beforehand. Not only did he not explain potential issues that might arise post-vaccine, he didn’t know what those issues might be.

Why?

Because he was never taught them in medical school. All he was taught regarding vaccines was the vaccination schedule. I had the same experience, except I wasn’t even taught the vaccination schedule. I attended medical school in the late 1970s and graduated in 1979, well before the 1986 National Childhood Vaccine Injury Act (NCVIA) removed all liability from vaccine manufacturers for damages caused by their products. When I graduated, smallpox had been eliminated, so there was no longer a smallpox vaccine. The vaccine schedule was the MMR, the DPT, and polio. That was it.

Once liability was removed by the NCVIA, the number of vaccines exploded.

Dr. Neides was about ten years behind me, so when he was in med school, there were many more vaccines than when I was a med student. But he wasn’t taught anything about the vaccines other than the schedule. I wasn’t taught about them, either. It was just taken as a given ‘truth’ that they were wonderful.

There is a major course in med school called pharmacology, which is the study of drugs. As I recall, it was a year-long course that ran through the second year. In the course, we learned about all the different classes of drugs, what they were used for, how to dose them, and what the side effects were. Vaccines were never mentioned.

I’m sure it was the same with Dr. Neides.

After his residency, Dr. Neides went to work for the Cleveland Clinic and ended up being the COO and medical director of the Cleveland Clinic Wellness Institute. His motivation was to keep his patients healthy, and somewhere along the way he realized that just maybe vaccinations were problematic. In 2017 he wrote an op-ed piece about wellness for a Cleveland news site that included these paragraphs:

Slight detour. Why do I mention autism now twice in this article. Because we have to wake up out of our trance and stop following bad advice. Does the vaccine burden - as has been debated for years - cause autism? I don't know and will not debate that here. What I will stand up and scream is that newborns without intact immune systems and detoxification systems are being over-burdened with PRESERVATIVES AND ADJUVANTS IN THE VACCINES.

The adjuvants, like aluminum - used to stimulate the immune system to create antibodies - can be incredibly harmful to the developing nervous system. Some of the vaccines have helped reduce the incidence of childhood communicable diseases, like meningitis and pneumonia. That is great news. But not at the expense of neurologic diseases like autism and ADHD increasing at alarming rates.

When I was in medical school in the late 1980s, the rate of autism was 1 in 1,000 children. For those born in the 1950's and 60's, do you recall a single student in your grade with an Individualized Education Program (IEP) for ADHD or someone with a diagnosis of autism? I do not.

As of 2010, the rate of autism in the U.S. escalated to 1 in 68 children. The deniers will simply state that we do a better job of diagnosing this "disorder". Really? Something (s) are over-burdening our ability to detoxify, and that is when the problems begin.

So let me be clear - vaccines can be helpful when used properly. But the vaccination timing and understanding one's epigenetics (how your genes interact with the environment) are all critical to our risk of developing chronic disease. Please talk to your doctor about the optimal timing of vaccinations for your children, and therefore reduce your risk of raising a child with a neurologic complication.

For those who want to dive in further, help me understand why we vaccinate newborns for hepatitis B - a sexually transmitted disease. Any exposure to this virus is unlikely to happen before our second decade of life, but we expose our precious newborns to toxic aluminum (an adjuvant in the vaccine) at one day of life.

And when they actually need the protection, many who have received this three-shot series in the first year of life will lack antibody protection--as immunity may not last. Perhaps delaying the series until the immune system is more mature would reduce the risk of neurologic complications.

As you might imagine, this did not sit well with his masters. In short order the gears began to turn at the Cleveland Clinic, grinding him out of his long-held position. Here is his telling of what happened:

On Jan. 13, one week after the article came out, I was relieved of my administrative duties, after being a loyal soldier for 20 years. I was allowed to continue to practice clinically, but essentially my career at the Cleveland Clinic was over," Neides told the group. "In September, they dissolved my clinical area within the Wellness Institute, essentially letting me know that at the end of the year I had to find other work.

As my redneck friends would say, It just don’t do to scorn vaccines.

For the life of me, I can’t understand the hallowed position vaccines hold in the minds of the medical profession. There is not a drug out there—including the lowly aspirin—that doesn’t cause side effects of one kind or another in some people. Vaccines are no different. It’s okay to discuss the side effects of every other drug, but mention vaccine side effects, and you’re in big trouble.

It shows the massive snow job Big Pharma has laid on the medical profession. If you’re in mainstream medicine and you criticize vaccines in any way, you risk your position.

Here is the video of Dr. Neides apologizing to his patients.


RFK, Jr. Meets With Big Food

Bobby Kennedy had his first meeting with the CEOs of Big Food a couple of days ago, and nothing much happened. According to both the NY Times and the Wall Street Journal, Kennedy’s big ask was to get rid of all the dyes in foods kids eat. Nowhere was there any mention of seed oils and/or ultra-processed foods.

According to the NY Times:

In his first meeting with top executives from PepsiCo, W.K. Kellogg, General Mills and other large companies, Robert F. Kennedy Jr., the health secretary, bluntly told them that a top priority would be eliminating artificial dyes from the nation’s food supply.

At the Monday meeting, Mr. Kennedy emphasized that it was a “strong desire and urgent priority” of the new Trump administration to rid the food system of artificial colorings.

In addition, he warned the companies that they should anticipate significant change as a result of his quest for “getting the worst ingredients out” of food, according to a letter from the Consumer Brands Association, a trade group. The Times reviewed a copy that was sent to the group’s members after the meeting.

And while Mr. Kennedy said in the meeting that he wanted to work with the industry, he also “made clear his intention to take action unless the industry is willing to be proactive with solutions,” the association wrote. [My bold]

It’s strange to me that dyes would be the first thing Kennedy leads with. He says it is a “strong desire and urgent priority” of the Trump administration to remove artificial coloring from food.

I’m not a fan of food dyes, but this is the first I’ve heard of its being an “urgent priority.” I guess you’ve got to start somewhere, but given Kennedy’s antipathy toward seed oils, I figured that would be the first thing he would bring up. Apparently, it wasn’t mentioned.

Later in the day, after the meeting, Kennedy, as HHS Secretary, released a directive to the FDA instructing that agency

to revise a longstanding policy that allowed companies — independent of any regulatory review — to decide that a new ingredient in the food supply was safe. Put in place decades ago, the policy was aimed at ingredients like vinegar or salt that are widely considered to be well-understood, and benign. But the designation, known as GRAS, or “generally recognized as safe,” has since grown to include a far broader array of natural and synthetic additives.

According to the press release from the HHS:

Today, as part of this commitment, HHS Secretary Robert F. Kennedy Jr. is directing the acting FDA commissioner to take steps to explore potential rulemaking to revise its Substances Generally Recognized as Safe (GRAS) Final Rule and related guidance to eliminate the self-affirmed GRAS pathway. This will enhance the FDA’s oversight of ingredients considered to be GRAS and bring transparency to American consumers.

“For far too long, ingredient manufacturers and sponsors have exploited a loophole that has allowed new ingredients and chemicals, often with unknown safety data, to be introduced into the U.S. food supply without notification to the FDA or the public,” said Secretary Kennedy. “Eliminating this loophole will provide transparency to consumers, help get our nation’s food supply back on track by ensuring that ingredients being introduced into foods are safe, and ultimately Make America Healthy Again.”

Currently, the FDA strongly encourages manufacturers to submit GRAS notices through the agency’s GRAS Notification Program, but industry can self-affirm that the use of a substance is GRAS without notifying the FDA. The FDA has completed and published more than 1,000 GRAS notices and evaluates an average of 75 notices per year. The agency maintains a public inventory where all GRAS notices that have been filed by the agency, along with the supporting data, and FDA’s final agency response letters are available for review and download by the public.

Eliminating the self-affirmation process would require companies seeking to introduce new ingredients in foods to publicly notify the FDA of their intended use of such ingredients, along with underlying safety data, before they are introduced in the food supply. [My bold]

As the NY Times article mentioned, additives such as salt, garlic, and other seasonings frequently used in cooking were thought to be benign, so companies could simply list them as GRAS (generally recognized as safe) without going through an approval process.

The old “give them an inch, and they’ll take a mile” dictum took hold, and Big Food began classifying all kinds of chemicals as GRAS, bypassing any regulatory review. This is why Europe has so many fewer food additives than we do in the US: in Europe these additives have to be approved before they can be added to foods. Not so much here.

I would suspect that all the honchos from Big Food figured they were in pretty good shape after the meeting, during which Kennedy focused mainly on artificial dyes. Then, when they got back to their offices, the second shoe dropped.

My guess is that the FDA will start reviewing a number of additives formerly self-declared as GRAS, and that many will ultimately be found to be not so GRAS and banned from the food supply.

At least that is my hope.

If you would like to support my work, take out a premium subscription (just $6 per month).

Is Butter Bad For You?

First, I hope not, because I consume a fair amount of it.

The title of this section comes from an article with the same title published in The Economist a week or so ago.

I’m always surprised when these terrible studies are foisted off on the world by health journalists, who should know better. Any health journalist worth his/her salt should smell the BS in these kinds of studies from a mile away.

I’m also always surprised when these kinds of studies get funded by the NIH, because they are less than worthless. As I’ll demonstrate, they are good for nothing other than maybe putting those who don’t understand the statistics involved off of butter for a week or so. I would think the NIH would have a better vetting process for studies and would refuse to fund something like this one.

When I saw the article in The Economist, I knew I was going to get asked about it, and, sure enough, I got a handful of emails wondering if butter really was bad for us.

The paper titled “Butter and Plant-Based Oils Intake and Mortality” was published in JAMA Internal Medicine a week ago. Not only was this POS written up in The Economist, it was reviewed in an editorial in JAMA.

It is behind a JAMA paywall, but I don’t need to see the whole thing to know how crappy it is. Just reading the abstract tells me everything I need to know.

The study—if you want to call it that—is a cohort study, a type of observational study that follows a group of individuals who share a common characteristic or experience over a period of time. The purpose is to examine how certain factors, such as exposure to a risk factor, affect health outcomes or other events of interest.

There are a lot of such cohorts around. The three used for this study are “the Nurses’ Health Study (1990-2023), the Nurses’ Health Study II (1991-2023), and the Health Professionals Follow-up Study (1990-2023).” These cohorts are groups of people who volunteered to be studied over a long period of time. They were in good health going into the cohort, then had periodic follow-up visits at which all kinds of health questions were asked of them. The databases of these cohorts can be, and often are, used as the databases for many studies.

The subjects making up the cohorts are usually given various questionnaires a couple of times a year. In this case, the subjects were given dietary questionnaires twice a year asking them to estimate how much of each kind of food they had eaten over the previous six months.

Let me ask you. On a daily basis, how much butter did you consume over the past six months? How many eggs? How many apples?

I’m trying to keep a food diary to go along with my ketogenic diet experiment, and if I don’t write it down as I do it, I always forget something if I try to reconstruct my diet the next day. And my ketogenic diet is a fairly simple one. If I ate everything that wasn’t red hot or nailed down, I would never be able to come up with an accurate accounting of my diet from the day before or the week before.

The folks in these cohorts are doing it for the previous six months.

When all the adjustments were made on these cohorts to make them fit the specs for the study (for example, the authors eliminated all those who had health issues when they signed up as part of the cohort), they ended up with 33 years of follow-up among 221,054 adults.

They analyzed the database for butter intake and plant-based oil intake, and concluded that those who ate the most butter had a 15 percent higher risk of dying.

Participants were categorized into quartiles based on their butter or plant-based oil intake. After adjusting for potential confounders, the highest butter intake was associated with a 15% higher risk of total mortality compared to the lowest intake (hazard ratio [HR], 1.15; 95% CI, 1.08-1.22; P for trend < .001). In contrast, the highest intake of total plant-based oils compared to the lowest intake was associated with a 16% lower total mortality (HR, 0.84; 95% CI, 0.79-0.90; P for trend < .001). [My bold]

They adjusted for potential confounders, which are arbitrary choices. You’ve got a situation where you’re relying on food questionnaire data compiled by trying to recall foods eaten over the previous six months, and then you fiddle with the data to “adjust for potential confounders.” After all that, the difference isn’t all that much. It’s certainly not enough to recommend that people not eat butter.
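To put that hazard ratio in perspective, here is a hedged sketch of what a 15 percent relative increase means in absolute terms. The 10 percent baseline mortality below is a hypothetical number I picked for illustration; it is not from the paper:

```python
# Under proportional hazards, survival scales as S1(t) = S0(t) ** HR.
# The 10% baseline mortality is hypothetical, not taken from the paper.
baseline_risk = 0.10   # assumed mortality over the follow-up window
hr_butter = 1.15       # reported HR, highest vs lowest butter quartile

risk_high_butter = 1 - (1 - baseline_risk) ** hr_butter
print(f"{baseline_risk:.1%} -> {risk_high_butter:.1%}")  # 10.0% -> 11.4%
```

Even taking the adjusted numbers at face value, the absolute difference is modest.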

Plus, there is a healthy user bias to contend with. Despite the fact, as I’ve belabored countless times, that even the cardiologists have given saturated fat a pass as not being problematic, most everyone still thinks it is a health issue. So people in these cohorts who are trying to eat ‘healthfully’ will probably avoid saturated fats, such as butter. Or at least avoid reporting that they used it. And they will probably use more oils in their cooking, or say they did.

But healthy people are healthy in a lot of ways. They don’t smoke as much. They exercise more. They watch their weight. They are more particular in what they eat. If they erroneously believe that saturated fat will cause their cholesterol levels to rise and potentially cause them to experience heart disease, they will probably avoid it.

Since healthy people tend to live longer, I would suspect that those who consumed less butter would live longer, not because they avoided butter, but because they live healthier lives overall.

I’m assuming the writer(s) of the article in The Economist had the full copy of the paper and any supplemental materials. No one is listed as the writer, so the article reads more like an editorial. Whoever wrote the piece has not kept up with the science on saturated fat.

Most of the fat in butter is of the saturated variety. It therefore stands to reason that butter should have a negative effect on heart health. Indeed, there is good evidence from randomised-controlled studies that replacing butter with plant-based oil can reduce cholesterol.

There is more bad news for butter-lovers. The new study, published on March 6th in JAMA Internal Medicine by authors in Massachusetts and Denmark, relied on data from three long-run trials of American medical professionals. For almost 33 years 220,000 nurses and doctors have been regularly surveyed about their lifestyle, diet and health. Many have died in this time. The authors found that, after controlling for other aspects of demographics, diet and lifestyle, those people who ate the most butter (averaging around one tablespoon per day) had a 15% increased risk of mortality compared with those who avoided the stuff. By contrast, people who consumed the most plant-based oils, such as canola, soyabean or olive oil—all of which have low levels of saturated fat—had a 16% lower mortality rate than those who consumed the least. [My bold]

The interesting thing about all of this is that the saturated fat in butter should run cholesterol up and lead to more heart disease. The lipophobes who wrote this article certainly think so. But those who consumed the most butter did not experience more heart disease.

And, though the study could not show that butter increased the risk of dying from cardiovascular diseases, consuming more plant-based oils did lower that particular risk. Butter-eating was, instead, linked to more deaths from cancer. The authors found that replacing ten grams of butter daily with the same amount of plant-based oil appeared to reduce the cancer mortality risk by 17%. [My bold]

The lipid hypothesis is simply that: a hypothesis. It posits that cholesterol causes heart disease and that saturated fat runs cholesterol levels up; therefore, saturated fat causes heart disease. It is still only a hypothesis, never yet proven. And in this case, the evidence does not show it to be true.

One more thing from the article in The Economist that makes the whole thing moot.

Observational studies like this one are rarely cut and dried, however. George Davey Smith, an epidemiologist at the University of Bristol, points out that there are other differences in health-related behaviours between the groups: the voracious butter-eaters contained twice as many smokers, for example, as the butter-avoiders. He argues it is not possible to fully control for such differences, which means some non-dietary factors could be at play. [My bold]
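To make Davey Smith’s point concrete, here is a toy simulation, with made-up rates, in which butter does nothing at all but the heavy-butter group contains twice as many smokers:

```python
import random
random.seed(42)

# Toy confounding demo: butter has zero effect in this model, but the
# high-butter group has twice the smokers. All rates are invented.
def mortality(n, smoker_frac, base=0.10, smoker_extra=0.10):
    deaths = 0
    for _ in range(n):
        is_smoker = random.random() < smoker_frac
        p_death = base + (smoker_extra if is_smoker else 0.0)
        deaths += random.random() < p_death
    return deaths / n

low_butter = mortality(200_000, smoker_frac=0.15)
high_butter = mortality(200_000, smoker_frac=0.30)  # twice the smokers
print(f"apparent excess mortality: {high_butter / low_butter - 1:.0%}")
# ~13% "butter effect" produced entirely by smoking
```

At rates like these, smoking alone manufactures an apparent mortality gap of roughly the same size as the study’s reported 15 percent.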

Let me ask you, dear reader, which do you think would contribute more to an early death? Eating a tablespoon of butter? Or being a smoker?

I rest my case.

For those of you who don’t have a firm grasp on what an observational study such as this one really is, I wrote a piece years ago explaining why such studies can’t prove causality.

How To Prevent Brain Aging

A really nice study came out this week that takes a deep dive into brain aging. It shows that the human brain’s aging process is not a steady, linear decline, but instead follows a nonlinear trajectory with specific transition points.

The research team evaluated neuroimaging data from 19,300 subjects using genetic analyses and interventional studies to figure out how the human brain ages in an effort to identify potential strategies to protect cognitive function.

Traditionally, brain aging has been characterized by a handful of degenerative processes, including glucose hypometabolism, brain atrophy (shrinkage), and the deposition of proteins associated with neurodegenerative diseases. Problem is, these findings often become detectable only in later stages when little can be done about them.

These authors took a different approach and employed functional magnetic resonance imaging (fMRI) to detect earlier changes in brain network connectivity and organization.

In their study of the 19,300 subjects, the researchers discovered that brain networks destabilize following an S-shaped (sigmoid) curve instead of a straight-line trajectory. This S-shaped curve revealed three key transition points in the deterioration of cognition.
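For the curious, here is a minimal sketch of the shape they describe. The midpoint and steepness are invented for illustration, and a plain logistic curve has a single transition rather than the three the study identified:

```python
import math

# Hypothetical sigmoid destabilization curve: slow change early, a
# rapid mid-life transition, then a plateau. Parameters are invented.
def destabilization(age, midpoint=60.0, steepness=0.15):
    return 1.0 / (1.0 + math.exp(-steepness * (age - midpoint)))

for age in range(30, 91, 10):
    bar = "#" * int(40 * destabilization(age))
    print(f"{age:3d} | {bar}")
```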

These findings challenge the conventional perspective of brain aging as a gradual, steady process. Instead, they suggest discrete phases with recognizable neurobiological characteristics. Identifying these transition points provides insights into when interventions might be most effective. And which of the various mechanisms might be best targeted.

Which, of course, is where the rubber meets the road. Let’s look at it.

Subscribe to Premium to read the rest.
