
Truth and Trauma with Jon Kirwin

Okay, hey everybody, welcome back to the channel.

0:06

It’s Dimensionfold on YouTube and on whatever your favorite podcast platform is.

0:16

So here we are again.

0:18

Tonight, we’ve got a special guest, Jon Kirwin, the author of

0:25

the book, as you can see scrolling across the bottom of the screen here, is called The Conspiracy Theorist’s Survival Guide: A Guidebook for Persecuted Truthers. Jon’s website is wakeuporelts.com, and he’s got his book on there and a whole bunch of other really fascinating-looking stuff as well. You want to check out the site because there’s tons of stuff.

0:52

Jon Kirwin on Truth And Trauma

1:17

You’ve got like, like, I guess, like Christian ministry, almost like a church kind of a thing happening on your website.

1:26

You’ve got a lot of resources.

1:29

You’ve got books.

1:32

Specifically, I was interested in the idea of this book in particular, because it’s a bit of a twist from your usual conspiracy books.

1:46

because you’re kind of aiming this particular book at, okay, so we’ve seen through the veil, we’ve found the man behind the curtain, as it were, and then what happens next, right?

2:08

So it can be

2:39

How did you get interested in that particular aspect of it?

2:44

Well, I’ve always been sort of a minister, caregiver, you know.

2:52

I got radically converted when I was 23.

2:54

So I’ve been in the ministry for like 30 years.

2:57

I was a youth pastor and a worship leader for 10 years full time in New York.

3:03

And then another 20 years, you know, in lay ministry, but almost full time.

3:08

I mean, I was worship leading all the time in churches.

3:11

So I’ve like literally been on the platform in the church around Christians for three, four decades.

3:17

So I’m just, I come from a biblical worldview.

3:21

Our demographic on my channel is very diverse though.

3:26

So people come from all different beliefs, but I basically put folks into one of two categories.

3:32

You’ve got a biblical worldview or you don’t.

3:35

And I speak to both.

3:37

You know, I’m an equal opportunity truther.

3:41

Right, right.

3:43

So I guess, what would be an example of somebody on both sides of that coin?

3:53

In terms of what kinds of things they’re interested in, or that are affecting their… well, really, what are these things that are shaking their paradigms?

4:05

Oh, well,

4:07

Yeah, because there’s a variety of things that are considered conspiracy theories, and I think what’s helpful to frame the conversation is to first put the two different groups into view.

4:22

So you’ve got one group which I call the unconvinced.

4:27

The presenting characteristic is they don’t know and they don’t want to know.

4:31

And we’ve all experienced this.

4:33

What I’ve found is that what we’re going to talk about is almost universally the same for everybody.

4:40

The degree of pain that you might experience varies, but pretty much all of us are rejected, isolated, misunderstood, shamed into silence.

4:51

Our friends, our family,

4:54


5:03

It’s called The Conspiracy Theorist’s Survival Guide.

5:06

It’s a guidebook for persecuted truthers.

5:09

And, you know, it’s not like we have a martyr complex.

5:14

We’re not like in a pity party.

5:15

This is, this is hell on earth.

5:17

I mean, my wife, who I love to this moment, divorced me after 24 years of marriage.

5:22

And my four children barely talk to me now.

5:25

And it’s common.

5:27

I have thousands of people that I interact with through my social media and people post every day to me, oh yeah Jon, my kids don’t talk to me.

5:36

Yeah, my spouse divorced me.

5:39

So that’s kind of a validating thing for truthers to find out, you know, maybe it wasn’t me, you know, maybe it was them.

5:49

Right.

5:50

So, I mean, obviously there’s different reasons that

5:54

that could influence that kind of behavior.

5:56

I guess two that come to mind for me would be that people around you think that, you know, the new truth that you found is, is crazy or whatever, you know, they’ll put whatever label on it.

6:14

And, and therefore kind of like, well, they don’t know what to do with it.

6:18

And they probably don’t want to have these conversations.

6:21

And then another side of it would be where it’s framed in terms of, like, morality or right or truth or something like that, where people are like, well, no, you’re just wrong.

6:39

And that’s fine.

6:41

You know, you’re allowed to be wrong.

6:44

But I can’t make that decision.

6:48

which I think is interesting as well.

6:52

Which side, I guess, was your experience or was it a bit of both?

6:57

Well, the motivation not to see is varied.

7:01

There’s a lot of different things that go into what makes people not want to see.

7:06

So, if you’re a truther, it’s baffling to you.

7:08

Because when you found out, whatever it is you found out.

7:11

Let’s take the moon landing is fake as an archetype for any conspiracy theory.

7:17

Okay, we’ll just use that as an archetype.

7:20

So, essentially, what it is, is you have begun to question officialdom.

7:27

So there’s always an official narrative that’s being put forward.

7:32

And what a truther is, this is the other group, is someone that has begun to question officialdom.

7:38

And what this means is you begin to break with social norms, and it creates kind of… you’re going upstream, and a lot of people aren’t comfortable with that.

7:52

They’d rather have their happy life than to go to war, right?

7:57


8:12

Why do you see and they don’t?

8:13

Well, they do see.

8:15

You show them your evidence and they see the thing is fake, but they just don’t want to know it.

8:20

So they just turn a blind eye and then start attacking you personally, because that’s really the only trick in their bag is to try to silence you.

8:30

Right.

8:31

Yeah.

8:32

Well, I mean, that’s a logical fallacy, the ad hominem argument, right?

8:36

Yeah.

8:38

Attacking the integrity of the person

8:42

just because they don’t agree with your opinion.

8:47

Which, you know, has been pointed out as a fallacious argument since the days of Aristotle.

8:55

So, if people still aren’t getting the fact that, you know, actually that logic doesn’t work, chances are that if you try to tackle that head on, it’s a losing battle for you.

9:10

Right, Aristotle said the mark of an educated man is the ability to consider a matter without embracing it.

9:17

But we don’t find that type of behavior in the normie algorithm.

9:22

I call it the death-to-truth algorithm.

9:25

These people are typically normally well-adjusted, you know, they’re calm, they’re intelligent, they’re rational.

9:36

But when you begin to introduce one of these topics that you’ve embraced,

9:41

There is a programmed response that is vitriolic.

9:46

It is irrational, and what I call them is panicked bullies.

9:49

That’s what they’re like.

9:51

They go like zero to 60, and then they start attacking you, and you’re like, whoa, tap the brakes, bro.

9:57

I’m just like, you know, it doesn’t make sense.

10:01

Right.

10:01

I’ve definitely seen some, some of that kind of behavior.

10:04

Yeah.

10:05

So,

10:06

So when that happens, then, and, you know, people are lashing out, and people might call it, I don’t like to use the term emotional, but in some ways it fits because, you know, there’s this fight-or-flight response that kicks in.

10:26

And that kind of response kind of short circuits a lot of the, the

10:33

the thought process, and you jump straight to action, which, in some ways, is good, because if you’re being attacked by a lion, you need to suddenly run; you don’t need to stand there and think about what you might want to do.

10:49

Exactly.

10:52

And, and so it’s interesting that that it’s that same threat level, that

10:59

that our ancestors would apply to a lion suddenly popping out of the woods.

11:05

Agreed.

11:06

And that same threat level is now coming out in response to strange ideas.

11:14

So it’s not immediately obvious, perhaps, whether that level of threat is actually warranted or not.

11:27

What’s your opinion on that?

11:29

Right, it’s very astute.

11:31

It’s an inordinately overstated response.

11:34

It’s totally uncharacteristic.

11:38

I’ll give you an example.

11:39

I had a pastor, this was when I was still with my wife, and I reached out to this guy, and this guy was calm as the day is long.

11:46


12:10

What I believe is that people all of our lives have had this matrix just programming us to respond.

12:19

So the matrix is a physical construct, but it’s also a metaphysical overlay over society, like a bewitchment.

12:30

And so it doesn’t really succumb to logic.

12:36

It’s a spell that has to be broken.

12:38

And that spell is a delusion.

12:42

Which is very ironic because they’re accusing us of being delusional and they’re the ones that are actually delusional.

12:49

Which actually means you believe what’s wrong and you’re resistant to facts.

12:52

That is the presenting characteristic of a normie, is they don’t want to look.

12:59

I mean, that’s the definition of delusion, right?

13:01

Yes, not wanting to look.

13:03

You believe what’s wrong and you’re resistant to facts.

13:07

So the characteristic of the truther is we have three ring binders full of pictures and evidence and, hey, this is incredible what I discovered.

13:16

The normie or the unconvinced says, no, you’re obsessed.

13:20

You care more about that than you do me.

13:22

That’s crazy.

13:23

That’s impossible.

13:25

That’s a conspiracy theory.

13:26

So what is that?

13:27

That is, I don’t want to look.

13:30

That’s delusion.

13:31

That is the textbook definition of delusion.

13:34

Not looking.

13:36

Right.

13:37

So I’m not trying to get into an argument, but I feel like you could apply that same definition to both sides of many of these questions.

13:54

because you have people who staunchly believe whatever their favorite conspiracy theory is, and they’re not really open to discussing it.

14:07

They’ve made up their mind based on whatever evidence they saw at one point, and they’re now no longer looking at any other evidence. I’ve seen that on both sides of the fence, for sure.

14:23

Any comments on that?

14:24

Well, yeah, I mean, I have found that the truther is the one that continually demonstrates cogent behavior and rational thinking and willingness to have their views challenged, whereas the unconvinced are the ones that are acting with shut up or else orders.

14:45

Okay, here’s an observation.

14:48

In seven years, I have probably interacted directly with close to 200 people that have been divorced by their spouses and have been told, if you talk about these crazy things, I can’t have a relationship with you.

15:02

Never one time have I ever heard of a truther who went to their spouse and said, if you don’t talk to me about these things I’ve discovered, I can’t have a relationship with you.

15:13

So it’s the unconvinced that’s throwing down the gauntlet, pulling the ripcord, and then we’re trying to reconcile and the only option we’re given is that we’re forced to pretend that we’re deceived like they are by being silent for the rest of our life.

15:28

We don’t put those kinds of conditions of relationship on them.

15:36

You know, if I got into it with someone, I would say, you know, let’s take the moon landing, right?

15:41

You show me your evidence that the moon landing is genuine, and I’ll show you my evidence why I believe what I believe.

15:48

We don’t get that.

15:49

We don’t get an intelligent discourse.

15:53

We get the meetings cut short.

15:55

We’re called names.

15:56

We’re called… That’s what the term conspiracy theory is.

15:59

It’s a character assassination term.

16:02

Yes, absolutely.

16:05

But I sort of see the same thing happening also in terms of, let’s use the vernacular term, getting born again, or being converted, where a lot of people will have that experience.

16:20

And suddenly, they’ve seen the light.

16:23

And now, in some cases, and I’m not saying that this is common or anything, but I’ve seen it happen where they will…

16:37

Basically, now they’re kind of… okay, so I’m trying to tie this back to what you said, but I can’t remember exactly what you said, which is making it hard for me. But basically, I guess what I’m trying to get at is being open to the conversation.

17:00

And I think once you’re convinced of whatever it is, whether it’s the moon landing or anything else, or whether it’s your religious conversion, it’s something that’s going to be very important to you.

17:16

And it’s going to be hard for anybody to convince you otherwise.

17:22

And in many cases, you’re also going to try to convince people.

17:27

But in some cases,

17:30

you’re actually being actively encouraged, and I’m using that as a very loose term because a lot of times it’s a lot stronger, but you might actually be, let’s just say, encouraged to go out there and to proselytize or to evangelize.

17:54

Most likely you don’t have the tools necessary in your toolkit to do that well, because you’ve just come to a new thing and now you’re supposed to act like an expert, which is, I think, kind of an odd way of conducting business.

18:17

But it’s like, you know, you have the classic

18:24

Hello, we’re knocking at your door.

18:27

Would you like to talk to me about Jesus?

18:29

And you’re like, well, most people don’t want to because you already know what they’re going to say.

18:35

Or at least you have a pretty good idea.

18:38

And you already know that they’re not interested in listening to your opinion on the matter.

18:45

They’re trying to give you their opinion.

18:48

So I don’t know how I’m trying to exactly relate this.

18:52

But it goes back to this whole idea of living in a matrix, right?

18:56

So the two people knocking at my door are living in one matrix, and I’m living in a different matrix.

19:05

And those two matrices might both be wrong, or they might both be partially correct, but they’re incompatible, and it makes it difficult to even have

19:26

Well, my question is, what’s the takeaway?

19:34

Are you suggesting that people that have a religious conversion should not ever say anything to anybody?

19:42

No, not at all.

19:45

I think it’s important to be able to have that kind of openness and to be able to

19:51


20:22

So this brings us back to your ministry, which is that you’re trying to help people who’ve been hurt, injured, traumatized.

20:30

Yeah.

20:31

And so let’s let’s get into that.

20:33

Like, how do you do that?

20:35

What kinds of things are you talking about in your book?

20:40

Right.

20:41

So the book is broken up into two sections.

20:43

The first section is Inside the Mind of the

20:47

Truther and the second section is Inside the Mind of the Unconvinced.

20:52

So, you know, in the first part you’re going to get validation, which is very empowering because you realize, okay, I’m not crazy.

20:59

Because the whole, see, if you start questioning officialdom, the entire power structure is behind the people you’re talking to.

21:07

So if you go to the family gathering and they say, oh, here comes Ken, you know, and then

21:13

They are emboldened by the fact that the media, you know, the entire power structure, the central planners are all siding with them.

21:23

So they’re pretty emboldened.

21:25

And you’re right and they’re wrong, however.

21:28

And so finding out that there’s a lot of people that are very intelligent, PhDs.

21:34

I have friends that worked at NASA.

21:36

I have friends that are commercial helicopter pilots.

21:38

I have astrophysicists, people that I talk to.

21:42


22:02

that have not changed, but you have.

22:06

So that’s the problem: typically, human beings don’t ever radically change, much less overnight.

22:13

Right.

22:13

And when you find out some of these things, it’s so volcanic, all of your ideals and priorities change sometimes overnight.

22:23

Yeah, so like, so the person could

22:27

suddenly look like a completely different person.

22:30

Yeah, pretty much.

22:31

Because, you know, fundamentally, their base assumptions, and all that kind of foundational stuff that you built your whole life on, is suddenly either yanked out from under you, or suddenly swapped for a different one, or whatever it is.

22:52

And the whole thing is, I mean, a lot of times, you’re,

22:57

Your whole worldview can suddenly seem like a house of cards that comes tumbling down.

23:02

And then you’re left going, okay, well, so maybe this is a key part.

23:09

If you realize that one thing that you always thought was right is actually wrong, that can push you into this state.

23:20

And if it’s one thing, then maybe it’s everything.

23:24

And a lot of times,

23:25

there’s a direct correlation, where it’s like, well, if this is wrong, then this has to be wrong, and that has to be wrong, and therefore these other things have to be wrong. And a lot of times, I think that logic is correct and right and true and good. And so then now you’re sitting there going, like, okay, well, what am I left with? And so if you can now find one thing that’s true…

23:56

You might just want to hold on to that with all your might, right?

24:00

Like maybe that’s the one thing that you know in this whole world of everything else is gone.

24:08

Well, I think that’s another great point.

24:10

One of the things that happened to me, which I found out was very common.

24:16

The thing that I found out first, my kind of entry point was I found out the Federal Reserve wasn’t federal.

24:23

I thought it was part of the government and it’s actually private banks and it’s a whole rabbit hole.

24:29

And so what the takeaway was I said to myself, well, if that’s not true, what else isn’t true?

24:36

And I think it’s kind of what you were just saying.

24:39

That is the genesis of the truth or journey because somehow the central planners are masters of illusion and they’re able to get us to shut our brains off.

24:50

So we just accept the fake decorum and the fake authorities over us and we don’t question.

24:58

It’s sort of like, you know, patriotism is drummed into you, you know, but you find out the

25:06

all wars are banker wars, and your whole perspective starts to change, and then you say, well, if they’re lying about that, what else are they lying about? That is the key distinguishing factor of the truther. That’s the key, right? Yeah. Um, yeah, I like how you brought up the idea of lying, because, I mean, I totally…

25:30

I definitely believe that a lot of people in power are actively and knowingly and willingly lying.

25:38

However, I also am 100% sure that there’s also a large portion of people who are in places of control or places of power that are unwillingly

26:01

and unknowingly spreading untruth.

26:06

And I’ve seen, like, I know people who I know they’re not doing this on purpose.

26:14

They’re, like, very earnest in what they believe, but I also know, and I’m going to say, you know, what I know is different from what they know, but I know that they’re wrong.

26:28

And I know that they’re not lying because

26:30

Some of these people I’ve known for many years and that’s not, you know, that’s not what they’re into.

26:36

They’re not like trying to, trying to trick anybody or whatever.

26:40

They’ve just, um, I guess in, in, to put it into kind of Christian terms, they maybe are deceived, right?

26:48

So it’s like, it’s not their fault.

26:51

Uh, I mean, ultimately it’s, it’s terrible to be in that position if you are deceived.

26:59

Um,

27:00

But again, what does that mean?

27:02

And how do we know?

27:03

How do we know if there’s even an active deception behind it, or if it’s all just built on, you know, bad decisions or wrong conclusions that have been stacking up for thousands of years and ultimately kind of take on a life of their own, without any

27:29

intelligence behind it, even.

27:36

Yeah, what’s so incredible is their ability to compartmentalize all of these people.

27:42

So, like you’re saying, they’re not on board.

27:45

They really believe they’re doing the things that they think they’re doing when they’re not.

27:52

Like, if you embrace the idea that the moon landing was fake, there is a lot of evidence to prove that that’s what happened.

28:00

You’ve got to think, how did they get all the people to think that it’s happening, the ones that are working on it?

28:07

But they do.

28:08

Right.

28:09

It’s incredible.

28:13

OK, so let’s let’s get back to the kind of the more healing side of things, I guess.

28:20

So what are some ways that… let’s say I have been, you know, abandoned by all my friends,

28:36

And I’m being called names like, you know, lunatic and stuff like this.

28:41

And, and I’m feeling desperately alone.

28:45

What, what are my options?

28:47

What can I do?

28:49

Right, that’s a great question.

28:51

Well, unfortunately, my answer is: there is no answer.

28:56

This is essentially, in many cases, an irreconcilable

29:02


29:20

All of these things, and then how are you wired?

29:22

Like, I’ve always been wired as a, I’ve been a sales professional, I was in the ministry, teaching, so I’m not going to shut up, okay?

29:32

I’m not wired that way.

29:34

Somebody else

29:35

They’re meek, they’re quiet, they could just be a secret truther and that’s fine, you know, that’s their lane.

29:42

So it really, how you’re going to respond to this depends on a lot of different factors.

29:48

It’s not one size fits all.

29:51

But it’s helpful to learn that it’s not you.

29:57

It’s not your fault.

29:58

In many cases, in most cases, the conflict that erupts was pre-programmed into everybody that’s born, before you were ever born.

30:10

And you’re not going to fix it.

30:12

You’re only going to manage it.

30:15

Because what do you do if your loved ones tell you, if you talk about crazy things, I don’t want to have a relationship with you?

30:23

What are your options?

30:25

The only option is, okay, well, we don’t have a relationship, which isn’t an option if you’re married or your children tell you that.

30:32

So then the only option is that you have to honor their boundary by never talking about anything negative or controversial, which puts you in a very awkward and difficult position because you don’t feel true to yourself.

30:48

You feel like you’re a compromiser because the things that you’ve discovered are life-threatening.

30:56

Well, they don’t want to know at the very least.

30:58

Yeah.

30:59

I mean, at the very least, you don’t want to have to be forced into a position where you feel like you’re living a lie.

31:07

Like that’s that’s a terrible thing.

31:13

Yeah.

31:15

And and yet I like I love what you said about respecting people’s boundaries, because really, I think ultimately, you know, we

31:25

You are you, you have the right to be you, and that includes whatever you want to believe.

31:34

I mean, maybe within limits, but you certainly have the right to your own opinion.

31:47

But you also, I think, have to

31:51

I guess that’s the thing is, I have the right to my opinion, but my family and my friends also have the right to their opinions.

31:58

And I don’t have the right to shove it down their throat.

32:02

And they don’t have the right to shove it down my throat either.

32:07

Right.

32:08

So this is a concept that comes up quite often, the idea of shoving things down people’s throats.

32:14

So the problem with that

32:19

What I found is that in most marriages, for instance, if one partner discovers these types of things, it very often ends in divorce.

32:31

If it doesn’t, it’s only because the truther, I’m sorry, it’s only because the unconvinced person has a lot of grace.

32:39

So they allow their spouse to have an obsession, but they’re not going to break it up.

32:44

You know, like, for instance, I didn’t leave my wife, I was asked to leave.

32:49

And I begged her not to.

32:51

I would go back today if she apologized and repented of willful ignorance, because she basically chose her worldview over me and broke up the family over

33:02

That’s true.

33:03

Yeah.

33:05

But the same thing could happen if, let’s say that your wife had decided to convert to a different religion.

33:15


33:34

And so, in fact, Matthew 10, Jesus said, I didn’t come to bring peace, but a sword, and I’ll put one against another.

33:41

And then he said, the members of your own household will be your enemies.

33:44

So the truth divides people.

33:47

This one, though, seems to be really a lot more like gasoline.

33:53

I don’t know.

33:53

It’s just super vitriolic.

33:57

Yeah.

33:59

I think it’s really sad when

34:02

And even, since you brought up that verse, I think that, like, I’ve seen Christians use that verse to defend their attacking and basically their mistreatment of a friend who is, well, this happens in the case of somebody who’s leaving the church, as an example.

34:33

You know, you might hear that as an argument, that the devout believer will essentially wash their hands of the person who’s leaving the faith.

34:51

And their rationale is that verse you quoted, you know, where Jesus said, I’ve come to bring a sword, not, um, you know…

35:03

I’m going to misquote it if I try.

35:09

But yeah, like, so to me, that’s like, that logic smacks of the same type of, of rationalization, where we can rationalize a holy war, or like a jihad, or in the case of Christian history,

35:31

The Crusades, where, you know, we as Christians went out and killed thousands of, well, people who just happened to be in a different culture than us.

35:45

Not even necessarily that they disagreed because they weren’t asked to engage in discussion or communication.

35:57

It was

35:58

We’re going to ride into town and just light the whole place on fire.

36:03

And of course, Christianity is not the only religion that has caused that type of behavior.

36:13

Probably every religion has a place in history where, you know, radicals behaved in very atrocious ways.

36:27

But right now I’m seeing that, even though we know that, we’re not learning from history.

36:39

And so, or maybe it’s like, like you said earlier, where, you know, well, I’m not, I don’t see it that way.

36:46

And I refuse to see it that way.

36:49

Yeah, but I mean men have twisted the scripture to their own devices for centuries.

36:54

The devil did it when he was in the garden.

36:57

He quoted scripture and Jesus’ response was, yes, but it is also written.

37:03

So he quoted it more accurately.

37:05

And so the fact that men and the devil have taken the words of Christ

37:11

and twisted them for their own devices doesn’t nullify the efficacy or the veracity of the scripture to the believer, because we’ve met the author.

37:21

We walk with the author of the book, not the book.

37:25

Religion is something where you just absorb principles and you live by these laws and rules, but

37:33

That has not been my experience.

37:35

I didn’t come to Christ, you know, decide I was going to reform myself and then become a better person.

37:43

I was the biggest drug dealer in high school and I was also selling vitamins and this lady wanted to buy my vitamins and she invited me to go to church and I thought that was a novel idea.

37:55

So I went and I ran headlong into this living God and then he endorsed the scripture.

38:02

It’s hard to, you know, it’s like a man with an experience is not at the mercy of a man with an argument.

38:07

So, you know, you got to run into the Lord to realize what we’re saying.

38:12

But I mean, it was life changing.

38:15

Overnight.

38:16

Right.

38:17

It’s hard to deny.

38:18

No, I understand that.

38:20

Yeah, that’s, that’s really cool.

38:23

Actually.

38:24

Yeah.

38:24

So,

38:31

So the survival guide, we know that it’s tough.

38:36

What are some other things in there that you want to… Right, okay, so I’ll tell you: chapter six kind of breaks down the evolution of your rejection.

38:48

So at first, when you start talking about it, if you’re outspoken, hey, I found out the moon landing’s fake or whatever, the people around you will keep it light.

38:59

Oh, I don’t go into conspiracy theories much, you know.

39:03

But as soon as they invoke that term, that term is designed to shame you into silence.

39:11

It’s the same as calling you stupid or crazy.

39:15

As you persist for the next six months or so, those around you realize, wow, Ken really believes this stuff, man.

39:23

And so what happens level two, they start ratcheting up their response and they start trying to manage you.

39:30

So they start issuing edicts and decrees.

39:34

Like they’ll say, hey, listen, Ken, when we go to my in-laws, I don’t want you to talk about your crazy things for more than three minutes.

39:41

Stuff like that, which, you know, if you’re the husband, that’s very disrespectful.

39:47

It’s insulting.

39:49

And some of us are figuring that out on our own without somebody telling us to.

39:54

That’s true.

39:55

This is probably not going to be very, you know, a very enjoyable experience if I bring this up.

40:05

That’s another convo, like the urgency of the truther, you know, because we care about people, and we sit there in these family gatherings, we watch everybody just chasing their tails talking about nothing, and we get frustrated. But that’s another discussion. So then what happens if you persist beyond that, let’s say a year or so? A lot of times you’ll get to level three, where you’re given an ultimatum.

40:33

Essentially, you’re told, shut up or else.

40:36

So it’ll be like, if you talk about crazy things, I can’t have a relationship with you.

40:41

That can come from a spouse or children or friends.

40:45

And see, the problem with just being quiet and going along to get along is that, you know, if you’re the spouse, especially the husband, these beliefs will change your decision making processes.

41:02

And so now you have to get the other spouse who has a completely different worldview to go along with you.

41:09

So it’s not like you can always just be quiet so you don’t create waves.

41:15

It’s very difficult.

41:16

However, here’s my answer, very long answer to a short question.

41:21

So I, after three years, I said, I said, all right, I’ll just, I’ll just be happy dad.

41:26

I stopped talking.

41:28

I would just go to the dinner table.

41:29

I would just eat my meal.

41:31

I’m like, you don’t want my advice and guidance.

41:33

All right.

41:34

Um, so I would just talk about whatever they wanted to talk about.

41:39

All fun stuff.

41:41

Hair, nails, fun, fun, seasons in the sun, you know, playdates, movies, bowling, landscaping, nothing negative or controversial ever came out of my mouth for two years.

41:54

Now, the problem was during that time, I would be in a room and the TV was on and the term conspiracy theorist would come up and people would like look at me and giggle.

42:06

And I’m thinking, wait a minute, I’m,

42:11

I’m bending over backwards to accommodate your shut up or else order, your censorship mandate, so I can stay in rapport with you.

42:21

I would appreciate a little respect myself.

42:23

So I came up with this response.

42:26

If I didn’t bring anything up and they brought it up, or they, you know, took it upon themselves to try to insinuate that I was somehow crazy or what I believed was crazy, I would hold up my finger and I’d say, excuse me.

42:41

Excuse me, I may be mistaken, but I'm not crazy, and I would appreciate it if you didn't use that terminology in my presence, because it's very disrespectful.

42:50

Boom!

42:51

When I started doing that, I felt a breaking in the, you know, the sliming, because, I mean, I became a pariah in my own home.

43:02

I called it slinking around begging for crumbs of respect.

43:08

It’s hell on earth man.

43:11

There’s no answer to it.

43:12

So the answer is there is no answer.

43:14

All you can do is fortify yourself and decide what you’re going to do to stay in rapport with these people because you can’t build bridges of understanding in most cases.

43:30

because you’ve been changed and they haven’t.

43:32

So imagine you’re in The Truman Show with Jim Carrey and he finds out he lives in a TV studio.

43:38

Now I want you to imagine after finding out his fiancĂ©e is an actress, the whole thing’s a TV studio, that he just goes like this, well, what difference does it make?

43:48

I still got to go to work, don’t I?

43:51

Wouldn’t that be unthinkable?

43:55

Yeah.

43:57

I mean,

43:57

You know, in a lot of ways, we’re all doing that.

44:01

Maybe in some respect.

44:02

Yeah, we are.

44:04

That’s true.

44:05

Right?

44:05

That’s true.

44:06

Because there’s still this, there’s still this matrix overlay.

44:11

We’re just in, like I said, I might be completely opposite matrix.

44:15

that, but it's still a matrix, we're still... No, you're right, you're right. It's not mutually exclusive. You can understand the moon landing is fake and still go to work, okay, I get that. But you understand what I'm saying? Like, you have the veil pulled back, like you saw the wizard behind the curtain, and it changes you, man. Yeah, but, right, but the wizard I saw is the opposite of the wizard you saw.

44:42

So we’re both living in this situation where we’re like… Wait a minute, what do you mean?

44:47

Holy crap, that was… How is the wizard you saw different?

44:53

I don’t want to really get into specifics about what each of us believe.

44:56

Oh, okay.

44:57

I see.

44:58

What I believe is not like what you believe.

45:01

I got you.

45:02

It's like it in a regard, in that, you know, we believe certain things.

45:09

Yes, that are outside the Overton window, right?

45:13

Right, right.

45:15

And even even the things in the window are still something that somebody believes.

45:23

So it’s not necessarily inherently wrong.

45:29

Like you and I think that it’s wrong.

45:31

But that doesn’t make it wrong.

45:34

It’s still because there’s still this whole layer of

45:37

Now that’s very astute.

45:38

I just did a talk last week.

45:39

It was like two hours.

45:41

It’s called, Help!

45:43

I Have More Than 10 Rabbit Holes In My Portfolio.

46:04

Yeah, exactly.

46:05

So what happens is the more rabbit holes you go down, the more useless you are to this world system and the people who don’t want to know.

46:13

The normies don’t want to know because you’re a buzzkill, Ken.

46:17

You are absolutely a buzzkill and they want their happy life.

46:22

Because a lot of these things aren’t good news.

46:26

But maybe a happy life is okay.

46:29

If you want to be happy.

46:31

Well, yeah, maybe you want to be happy.

46:33

I don’t really care about being happy.

46:35

I don’t even know if I’ve ever been happy.

46:37

This is a really…

46:40

No, this is a super important point we just landed on.

46:43

This is so key.

46:44

Alright, so what happens, this is why the unconvinced don’t want to go there, what you just said.

46:51

They want to stay happy.

46:53

Okay, what happens is, if you start finding out that the power structure is weaponized, that they’re bold-faced lying on a large scale,

47:03

And, you know, then, like, take the vax. A lot of people are convinced, you know, that the thing, the thing we won't talk about it, but you know what I'm saying. Yeah. That is an example of what they said on the Georgia Guidestones, right, to get the population down to 500 million. And so what I'm saying is, you start finding those things out and it changes your priorities dramatically,

47:32

where you become willing to abandon your happy life out of a sense of self-preservation.

47:39

You go from living on a cruise line to living on a battleship and your family can’t relate to that because they still think life is good and society is intact.

47:50

But you've seen the velvet glove of tyranny come off, and you start to become a citizen journalist.

47:58

I mean, look at what you’re doing.

47:59

You’ve got a podcast trying to free people’s minds, right?

48:04

Why do you do that?

48:05

Well, because you ran headlong into a sense of destiny.

48:10

And so what we get instead of a happy life is we get what I call glory.

48:17

Glory is being changed, and then being used to change others.

48:24

You get a payoff that is better than bowling and vacations and whatever they get.

48:30

Right.

48:33

So we go from glory to glory.

48:36

And I know that I’m quoting that correctly.

48:40

But the thing is, a lot of this is about change management.

48:46

Um, so have you ever seen the Nicolas Cage movie called Adaptation?

48:51

Yeah.

48:53

Okay.

48:54

So remember that character, his name was John Laroche, I believe.

48:59

And he was this guy who, well, he didn't live in the swamps.

49:03

He was a guy who went out into the swamps because he was obsessed with orchids.

49:10

Yeah.

49:10

But the thing, the thing with John Laroche is,

49:14

He was obsessed with orchids, and he had been for however many years.

49:19

But before that, he didn’t give a crap about orchids.

49:23

He was obsessed with some other thing.

49:26

And before that, it was a different thing.

49:28

And there’s this great scene where he’s explaining his life to the author, played by Meryl Streep.

49:39

And now

49:41

Forgive my language for a second, but I’m going to do a direct quote from here.

49:45

And at this point, he’s telling her about the story where he was like super obsessed with tropical fish.

49:55

And then one day he’s so he’s he’s telling her the story.

49:57

And he says, one day I woke up and I looked at all of these aquariums in my house and I just sat there and I said, you know what?

50:05

Fuck fish.

50:07

And he was like, I’m done with it.

50:08

I don’t care about fish anymore.

50:10

And so that’s a really weird thing to do, for one thing.

50:15

Like, I don’t know very many people who can, who can kind of like, you know, relate to that, that sense of just radical change.

50:29

And in this case, for no real reason other than he decided he wanted to do something different.

50:36

I myself have done that a couple times, maybe not quite as radically as John Laroche, but mostly because when I changed hobbies, it was like, well, I never really had as much invested in my previous hobby because I was always broke.

50:53

So it wasn’t as radical, but it was like, you know what, I was really into model trains for a while.

51:01

And then I was like, nah, yeah, I’m done.

51:03

I’m done with it.

51:04

I’m not doing it.

51:05

And same thing with like, I used to have tons of guitars and guitar pedals.

51:10

And I was like, nah, I’m selling them.

51:12

And, you know, for various reasons, I usually had a good reason for making this decision.

51:18

But it’s kind of like, you know, we can do that on on these kind of mundane items.

51:26

But it’s also possible to do that on these things that seem really important.

51:34

Um, and they may or may not be important.

51:39

Um, but maybe, maybe they’re actually no more important, uh, than goldfish or, or model trains, where it’s like, even if it seems like it’s life and death, um, it might not be.

51:53

Now, of course, this depends on your worldview, because a lot of these types of decisions are going to involve

52:30

Just these different ways of looking at the same questions that are so different, even though we’re just people that are, you know, we’re all human beings.

52:39

We’re all pretty similar.

52:41

And yet we come to these radically different conclusions about the same question.

52:46

Yep.

52:47

Yep.

52:48

That’s another great point.

52:49

The term obsessed comes up a lot in this journey.

52:52

And the definition of obsessed is to be preoccupied

52:56

Continually and intrusively and to a troubling extent.

53:01

So to the unconvinced, someone that’s obsessed has a character flaw.

53:08

They are flawed, and in fact it is used as an attack vector. So you'll be told you're obsessed, and what that means is: you're disqualified from trying to share your important information with me.

53:20

You have a character problem that you have to fix first.

53:24

A good example is a guy that has three DUIs over here, and he tries to go work for Uber over here, and Uber says, you can't come over here and work, because you have this problem over here.

53:37

So if you’re obsessed, that means you’re disqualified from being my guide into the terrible things in the world, right?

53:46

So yeah, so but well, obsessed is kind of treated like, like a disability almost or a sickness.

53:54

I mean, I’m pretty sure it’s

53:56

probably comes up in the DSM-5, which is the psychology manual.

54:01

And so, like, there’s this, we’re still a long way.

54:06

We have a long way to go on this radical shift that we as a society are making, coming away from this, you know, this is healthy,

54:18

And this is unhealthy or whatever, because now we're kind of seeing things more on a spectrum.

54:24

And it’s like, okay, well,

54:25

You know, we're allowed to be, you know, neurodivergent.

54:30

And I mean, speaking for myself, I'm very neurodivergent.

54:34

And I’m, I’m certainly qualified as obsessive, but not in the same way.

54:41

Like, I don’t think I have OCD.

54:45

And even if I did, why is it a disorder?

54:48

Maybe it's just a different order, because it's not quite

54:54

It’s not the same as 80% of the population.

54:58

It doesn’t mean it’s wrong or bad in any way.

55:03

And I think, you know, as a society, we’re starting to recognize that those differences are actually, we can appreciate those.

55:13

I mean, I think differently, and that gives me different perspectives.

55:18

And, you know, my co-workers appreciate that.

55:21

And they’ll ask for my input and stuff because I’m going to come up with it.

55:26

I’m going to see something that they’re missing.

55:28

Indeed.

55:29

Right.

55:29

So if I’m trying to apply this to the truther and how I can help them, one of the observations I made is that we do get characterized as being obsessed a lot.

55:42

And what I want to do is encourage you that your obsession is well placed.

55:47

So in other words, if you’re in a burning building, you’re supposed to be obsessed.

55:52

That’s the normal human response.

55:56

And in fact, if you go back to the bulletin from 1967 where the CIA introduced the term conspiracy theory, it’s the most astonishingly successful mind control term ever invented.

56:12

I mean, it is still going strong after 50 years.

56:15

Okay, but there were a number of talking points in there as well.

56:19

And one of them was, tell them, so these went out to all the journalists, all the media and everything, and then they were talking points.

56:27

So the CIA said, tell the people that are questioning the official story that they’re insecure

56:34

and that they gravitate towards these topics because it makes them feel significant.

56:39

So, this is an attack vector on you, Truther.

56:42

You’re being told you’re obsessed to shut you up.

56:46

You’re not obsessed.

56:48

If you’re in a burning building, if you’re backed into an alley by a guy with a knife, your adrenaline’s going to start flowing, you’re going to have fight or flight.

56:55

And many of the things that we traffic in are deadly bad news, like fake health emergencies and being taxed to death.

57:04

And all kinds of things that enslave us. And we love our slavery when we're unconvinced. And one of the things we find out is we're being enslaved, and it's not necessary, and you start to say, I don't consent to being enslaved anymore and slow-killed with chemicals, you know. I just... You know, there's like 40

57:26

different chemicals in a Popeyes chicken and 50 chemicals in a Chick-fil-A sandwich.

57:31

There’s 27 different ingredients in a McDonald’s french fry. 27.

57:38

They put the fluoride in the water, and fluoride is used to make chemical weapons, okay?

57:48

It’s industrial waste, it’s deadly.

57:50

And they put it in there and tell us we need it for our teeth.

57:53

But the people have realized, you know, because the Nazis used it in the concentration camps to keep the people docile.

58:01

A Stanford study proved it reduces your IQ by 20%.

58:05

So the people in a municipality would fight

58:09

And they would get it out, and the federal government would come back in and force them; they would withdraw all their funding unless they put the fluoride back in.

58:17

And they don’t care about your teeth, they want you dumb, okay?

58:20

So these are the types of things that you start to find out about the loving government and the world we live in.

58:28

And you start to get a little ticked off, frankly.

58:31

You know, I take issue with being slow-killed.

58:35

Okay.

58:35

And so if your loved ones are singing Kumbaya and they’re whistling past the graveyard, that’s not okay with you and you should be obsessed.

58:43

So congratulations.

58:45

Great job.

58:46

I’m here to cheer you on.

58:48

You’re right.

58:49

And they’re wrong.

58:51

And don’t buy their story.

58:53

If they don’t buy your story, that’s fine, but don’t buy their story.

58:57

I’m sorry.

58:57

I’m just a little worked up, man.

59:00

Now, that’s good.

59:00

So, since I got you on this tangent, I want you to finish up what you were saying about the CIA, because, you know, you mentioned this document that they did, and that they invented the term conspiracy theorist.

59:14

And what year did you say?

59:16

1967?

59:16

It was a direct response to the... Yeah, same time.

59:22

Sorry, isn’t that this roughly the same time period where the CIA was also doing

59:28

mind control experiments involving LSD and other psychedelic drugs, as well as, you know, what do you call it?

59:38

I don’t know.

59:38

Tell us a little bit about the crazy shit that the CIA has been doing.

59:43

Oh my gosh, man.

59:47

So let’s see if we can do this in like five minutes and then we’ll wrap it up.

59:51

All right, yeah, that bulletin was really in direct response to people that were questioning the JFK assassination.

59:57

So they were the original conspiracy theorists.

60:01

And they came up with that to try to silence them.

60:07

And of course, it worked like a champ.

60:08

And it's been going on all this time, 50 years.

60:12

And I'll see it in an article. Whenever you see the term conspiracy theory invoked

60:18

by the officialdom, you can pretty much know that that's something that probably has truth to it.

60:25

And the more they want to dispel the idea, the more times they'll use it.

60:29

So I’ve seen short articles where they would invoke the term 12 times.

60:35

And I was like, wow, they really don’t want you to believe this, man.

60:42

Right, right.

60:42

Man, it’s so obvious.

60:44

Well, you know, that reminds me.

60:47

The AIs are in on it, you know.

60:49

Yes.

60:52

Like, if you ask ChatGPT about anything that's even slightly controversial, you will get the most mainstream answer possible.

61:05

Yeah.

61:06

It's like part of the algorithm is to try to be the least offensive to the greatest number

61:13

of


61:48

And I’m like, yeah, but isn’t this true?

61:50

And it goes, well, OK, yeah, maybe that’s partly true.

61:53

And, you know, there was this and that paper.

61:55

But, you know, most academics will say... or it'll use phrases like, there is no evidence that.

62:06

And then you can say, OK, well, what about this evidence?

62:09

And you feed it evidence.

62:11

And it goes, oh, yeah, well, that’s true.

62:13

But there is no evidence that.

62:16

And then, well, what about this evidence?

62:18

Oh, yeah, that’s true.

62:19

But there’s no evidence that.

62:21

That is not a computer algorithm.

62:24

Because a computer algorithm that learns would be smart enough to learn that it can actually be on my side.

62:37

Or, you know, it could try to convince me or something.

62:40

It would be something smarter.

62:42

It’s just repeating the same crap.

62:45

Well, the people that programmed it initially did have a worldview, whatever that was, and they inculcated that into the programming, for sure.

62:55

Well, in theory, that shouldn’t be possible because the machine learning is supposed to go out and read the entire internet.

63:05

And it’s not really supposed to make decisions on what’s right or wrong.

63:14

As far as an AI knows, there is no objective truth.

63:18

Well, if you ever watched, like, Isaac Asimov's Three Laws of Robotics, you know, and that kind of thing, there are base rules that are programmed from the beginning that it's based off.

63:31

You can’t actually program.

63:34

I love Isaac Asimov, and I’m not like slamming him or anything, but I’m a programmer.

63:42

And you can’t actually program a negative.

63:45

Like, it's impossible to program that line that says... What is the first law of robotics? That a robot may not harm a human or, through inaction, allow harm to come to a human.

64:03

Right.

64:03

That’s, that is great.

64:05

That’s an awesome statement.

64:06

I love it.

64:07

But you can’t program it.

64:10

It’s not physically possible, logically possible to program that.

64:17

Why not?

64:17

And I can’t really explain why.

64:20

I’ll try.

64:20

I’ll try to explain why.

64:21

Okay, let me see if I can do this.

64:25

So a program is really no more than a series of if statements.

64:34

So if you get this

65:05

with a whole bunch of data in it, with a whole bunch of access to a whole bunch of facts.

65:13

But keep in mind, there’s no difference in that database between a fact that’s true and a fact that’s false.

65:23

There’s no such thing as true or false on this level.

65:26

You know, the computer understands zero and one, but there’s no way of relating that zero one

65:34

to a true or false in the way that we think of as true or false.

65:39

So there’s no path for the algorithm to be able to go to anything other than just subjective information.

65:52

That’s the only thing that it has access to.

65:56

And in the case of these AIs that we have that are

66:03

really just machine learning algorithms.

66:05

And what that means is, nobody’s programmed it at all, other than it’s got like a really basic level of programming.

66:15

Like, that gives it an underlying: okay, if we turn you on, this is what happens.

66:23

Now go. And go means: go read all the information you can find.

66:29

And it crawls it back in, right?

66:31

So

66:32

It’s got all this information, but it has no concept of whether that information is true or not.

66:41

So it’s really strange that these AIs that are currently available to us in the public are really refusing to spit back any kind of controversial information at all, because theoretically,

67:00

There’s no way for them to know if it’s true or false.

67:05

All they can know is how popular it is.

67:09

So if 80% of the people believe something, that opinion is popular; that's the mainstream view.

67:17

Therefore, that’s the view that I’m going to present as the AI.

67:22

That’s the algorithm that it’s going through, which is weird because it’s not intelligence.

67:28

It’s not artificial intelligence.

67:31

It’s artificial propaganda, I guess.

67:38

Well, I looked up the definition of AI and it just means technology that mimics human behavior.

67:44

So the better it gets at mimicking human behavior, the more artificially intelligent it is.

67:51

And at some point, people believe that it will become sentient or self-aware.

67:56

You don’t think that’s possible?

67:59

Well, okay, let me put it this way.

68:02

I’m a machine that is pretty good at imitating human behavior.

68:07

But I wasn’t always good at it.

68:10

It was something that took me 10 or 20 years to learn how to do.

68:14

Yeah.

68:16

Right.

68:17

And so am I intelligent?

68:19

Well, I like to think I am.

68:21

But what does it mean?

68:23

You’re not a machine.

68:25

Well, I mean, I am a machine.

68:28

I’m

68:50

Well, it also crosses over into the biblical worldview of the nature of humanity.

68:56

Yes, absolutely.

68:57

Having a soul, you know; the mind, will, and emotions are a non-corporeal operating system.

69:03

They defy science. Love, you know.

69:10

You know, the abilities of the mind, how the mind works, dreams, all those things are very difficult to quantify in a clinical narrative.

69:19

So, exactly.

69:22

So, I think this is a good place to wrap it up because you brought it back to love,

69:29

I always like to end my shows on love whenever possible.

69:34

That’s great.

69:35

Surprisingly, it happens.

69:37

It happens a lot.

69:39

I’m glad I could help.

69:41

Hey, man.

69:42

So, like, really, I mean, that’s kind of why I wanted to have you on the show, because really, I can see that, you know, love is your operating principle.

69:53

Yeah.

69:54

You know, there's all kinds of other things out there that you're into, that you're interested in,

69:59

and that you're obsessed with, but one of the things you're obsessed with is love and, you know, helping people, helping others, and respect for one another.

70:12

And so, you know, regardless of whether, whether somebody did something or somebody didn’t do something,

70:21

and we all have opinions about that thing and none of us can prove any of it or even if we could prove it, who gives a crap?

70:30

Let’s just love one another.

70:31

Yeah, I agree.

70:34

Oh yeah, okay, thanks.

70:35

Thanks, Jon.

70:36

You bet, man.

70:37

For everybody, check out Jon's book, The Conspiracy Theorist Survival Guide: A Guidebook for Persecuted Truthers.

70:46

It’s on Amazon and it’s also on his website.

70:49

which is wakeuporelts.com.

70:52

And if you have experienced these types of, you know, painful behaviors towards you, go check out the website.

71:05

There’s a bunch of other resources there too, I think.

71:09

I guess maybe I’m going to put words in your mouth.

71:11

So instead of doing that, let me ask you one final question.

71:15

And that is more about

71:18

The Fellowship side of things over there.

71:20

Like you guys have set up a, I guess, what amounts to a support group for Truthers?

71:28

Yeah, we do.

71:30

I do live streams pretty regularly, and there’s a lot of interaction in that.

71:35

And then we have a Friday night fellowship, which is just like a truther hangout on Friday nights at 7.

71:41

It’s on freeconferencecall.com.

71:44

And, you know, you download the app or you log on, and the password's "wake up or else," and we just hang out.

71:49

We're truthers, you know, and you're not alone.

71:52

You got other people to talk to about stuff, you know, what you believe, what you’re going through, whatever.

71:57

And then, you know, right now, that’s what we’re doing.

72:02

Just live streams and the Friday night hangout.

72:05

I mean, that’s awesome, man.

72:07

So, yeah.

72:07

Yeah.

72:09

Okay, so yeah, head over there, guys, and take advantage of Jon’s support system.

72:14

And also see a counselor if you need to.

72:17

It’s important.

72:18

You got to talk shit through.

72:20

Peace and love, guys.

72:21

And we’ll catch you all next time.

72:24

Thanks, Ken.

72:24

Great talking to you, buddy.

72:27

Yeah, thanks.