Connectivity can be made less bad by having an interface which doesn't have infinite scrolling, pop-up notifications, ads, or suggested clickbait. See https://conntects.net
But whether a paid service can keep the spammers at bay is indeed an interesting question. There WILL be armies of programmers working on the problem. There have been such armies for quite a few years, going back to the Bayesian filters used to block spammers.
To some degree, the Internet has been partially dead for decades. Once upon a time, the Internet was an exclusive domain of high-IQ humans. USENET was amazing. When HTML was invented, there was still the filter of people smart enough to learn HTML and get hosting set up. Once Blogger and Wordpress democratized the Internet, the density of interesting content dropped. I still pause and look around whenever I come across a site that is clearly hand-crafted.
Google has been battling splogs for a long time. They had to scrap their original algorithm in favor of AI and extra dependence on "official/credible" sites to avoid manipulation. This is one reason why Google's algorithm has become more woke over the past few years. There are just not that many true news sites that aren't woke. An actual news network which actually lived Fox News' "We report; you decide" slogan could sway things back. Instead, we had the once credible yet conservative Forbes open its site to sploggers in an attempt to boost revenues.
Maybe with AI, the sploggers will finally win and paper books and real meetings will become the norm again. But as this war has been going on a long time, I won't write off the online certifiers yet.
In the end, I don't think there's any algorithmic method for keeping out the AI sludge. Every algo developed to detect AI just provides training data for the AIs trying to get around it.
This is why I emphasize personal, direct connections so strongly here. It's really the only way. Those trust networks can in turn be leveraged to provide a curated database of human-generated information. I don't see any other means of doing this.
As to social media, I strongly suspect its days are numbered. Personally, I already spend far more time in GCs than I do in my social media feeds.
Ah! Most social networks -- including the one I linked to -- support that. That won't go away as long as there is out of band verification of the participants.
My point isn't that they're going away, it's that GCs are becoming the main event. Indeed, it's very likely that for a majority of people, they already are.
Thorough, and thoroughly entertaining, John, as always.
My thoughts dovetail with yours re: the inevitable bounce-back into the real, particularly in the realm of art. From my essay, Art vs. AI -- Bring it on, ChatGPT:
"Eventually, though, we will tire of it. We will want something different, and in this case, “different” will mean “real.” We will hunger for words labored over by human beings the way the body craves homemade soup after too many gummy bears.
Even now, we still want to experience human artistry in person. We still go to the theatre, even after movies and dvds and live streaming all said theatre was dead. We still go to sporting events, music concerts, yoga classes, and outdoor festivals.
Last weekend I strolled around Localtopia, one such festival in St. Petersburg, Florida, and took in the infinite variety of human arts on display: kids and adults covering an entire school bus with every paint color imaginable; tote bags made out of leather book covers from the 50s; hula-hoop dancing lessons under shady oaks.
I turned a corner and came upon a curly-headed guy seated at an old typewriter, clacking away.
Seated off to the side was a woman, watching as he occasionally stopped, lifted his eyes to some unseen muse above, then went back at it, pounding those keys with assurance.
The sign draped over his tiny table read,
'Gio's Typos, Poet-for-Hire.
Pick a Topic, Receive a Poem.'
The woman sitting next to Gio waited patiently for her poem, the one he was drawing out of the ether at that moment in time, just for her, on that street filled with the kaleidoscope of humanity.
Behind her was a line of others, chatting while they awaited their turn to hand him $20 and tell him their names, their loves, their griefs, their stories — all unique.
All of those people could have created their own HIGH QUALITY POEMS IN SECONDS, but instead, they chose to wait for the guy who types poetry on a vintage Hermes 3000 typewriter onto the back of old National Geographic maps."
AI may one day replicate human experiences, but it cannot replace human interaction. It may one day be spliced into neurons, but it cannot be intertwined with the human heart. It may one day be able to speak in my voice, but it will never speak from my soul.
"It’s just a rat’s nest of algorithms, coldly and unsympathetically studying your responses in order to refine its model, with the sole objective of manoeuvring you with inhuman patience towards some predefined ideological or commercial goal."
John Carter, I'm guessing you don't have any children. :-)
> dissident right networks are well-placed to leverage their experience to navigate this landscape. This is similar to how the ...
... Communist Party of Yugoslavia was forced to operate underground for a decade before the Axis invasion, so was well poised to start its ultimately successful overthrow of fascist occupiers. xD
About twenty years ago, I got this idea in my head to try to read a book while going for a walk. I probably picked it up from some book or other about olden days, when it wasn't too uncommon a practice.
After a wobbly start, which included consciously training my peripheral vision to recognise the sides of curbs and crossing marks and such (as I lived in a city then), I found to my delight that it was trivial, provided the book wasn't some 'Lord of the Rings' in one volume kind of thing.
As this was before the proliferation of portable computers+screens with a telephone-function built into them, I sometimes got comments that it looked weird or goofy. Nothing unkind, just people remarking on an oddity of behaviour. Funnily enough, Middle Easterners were the most accepting ones. Perhaps walking and reading is more common in some areas there?
To this day, people who see nothing strange in zombie-ing about with their portable telescreens still raise their eyebrows at a man pulling out a soft-cover novel while walking through town or waiting on the bus. No particular point to this ramble but I have to wonder, is there a cognitive difference in what is happening in the brain? I suspect so, but have no proof. Reading a novel does not give that dopamine kick social media triggers, nor does it work to wreck your attention span or ability to focus; rather the opposite in fact.
Do not, should you try it, do so while riding a bike.
When it comes to AI/robot intelligence, the closest to it is human autistics of the variety with normal to above normal general IQ. Not that they lack conscience or emotion the way a machine does, that's not what I mean - no, what I mean is this:
To many of them, emotions et cetera are just another factor to be consciously weighed regarding the issue at hand. This is what makes them seem cold to others: if emotion(ality) or similar isn't part of the situation, they will not display it or take it into account. Getting them to understand and accept that almost all humans they will ever interact with do not, indeed cannot in the case of many, separate emotion from logic or fact is difficult at best, nigh-on impossible at worst. Sadly, many of them often retreat behind an emotionless shell for lack of the ability to handle their own emotional responses.
I think perhaps developers of AI would do well to study human autistics, because the only intelligence we can compare AI to is ourselves and emotion(ality) is part of human intelligence.
When I was in basic training, as we were being marched to the mess one morning, I fell asleep while walking. Just sort of dozed and allowed the sergeant's voice to guide me. Wasn't even using my peripherals in that case.
There's definitely something distinct about using phones vs books ... the former grab your attention in a way books simply don't. It's the interactivity, the full-colour display, the constant notifications, the endless possibilities implied by access to everything in your hand. A book provides none of that.
There's definitely a similarity between autists and AIs. It's no accident that so many programmers are themselves at least mildly autistic. They're trying to replicate cognition as they experience it, following the left-brain methodology - assembly from components - that they naturally understand. Of course that isn't at all how the human mind actually works.
I've yet to read the article you did that directly addresses his writing, but I noticed many parallels from the ones I've encountered so far. I'm reaching the end of The Matter with Things and find his work exceptional... and yours, as well!
The Matter With Things is essential reading, and not only because it makes my wordiness look comparatively restrained. I finished it about a year ago, and it was worth every page. Haven't even come close to digesting it.
I agree wholeheartedly. His insight into the differing views made available by the two hemispheres really tied together my experience with psychedelics, on the one hand, and nearly 20 years of Zen meditation practice, on the other.
Suggesting a use for NFTs? Now I know you're a crypto shill-bot.
In all seriousness, another great essay. I used to wonder what technology my kids would master that I'd have no idea how to use; now I think they'll look upon my smartphone use the same way I look at my parents' Facebook use: a blend of concern and pity. I actually hope that's the case - or, better yet, that I use screens infrequently enough that they don't have to worry!
One of my best friends has some business going involving NFTs. He has explained it to me slowly, in small words, and patiently, multiple times. I still have no idea what they are or why anybody cares about them.
"Say you’re a foreign government, and you want to undermine an adversary."
I'd restate as, 'say you're our domestic government, and you want to undermine your enemies, the peasants.'
===
"Initially, the LLMs are there to agree with their targets, ingratiating themselves by saying interesting things that constructively build on whatever point the target is making in the discourse, supporting them in arguments with others, and so on. Over time, trust is built up. Then, gradually, the shillbot starts trying to vector the target."
This is exactly what the Assassins did, and is the real reason they were able to earn their moniker: trusted servants deciding the way to paradise lay in obedience to the Old Man of the Mountain, even when there was no chance of escape. Perhaps MKUltra is about to be replaced with something much more reliable.
===
If our side ever decides to get political, gaining local and state power in order to control a region, we'll most likely have to go back to a literal whistle-stop campaign, so everyone can see that our leaders are actual people.
You're absolutely correct. There is zero reason to expect that governments wouldn't use this technology domestically, and every reason to expect that they would and, indeed, already are.
A foundational ideology for the belief that AI will eventually equal and surpass human intelligence is that intelligence consists in the manipulation of bits. A deeper and more sinister idea is that reality is fully quantifiable; that only the measurable is real, and that everything real is measurable. Granting these assumptions, a perfect simulation is possible in principle. (Brains-in-vats theories also depend on this.)
Paradise is to be achieved by corralling all of matter into one huge data set. Then we can engineer the social and physical world to perfection by treating it as a giant min-max problem.
We’ve come quite far already. The hideous architecture you speak of accomplishes one thing admirably: It maximizes some quantity (such as floor space) per dollar. More generally, we live in a society of obscene quantitative excess and grievous qualitative poverty.
Such is the result of what I call “the cult of quantity.”
I wonder what you would say about what looks like a contradiction: on the one hand, rejecting that intelligence can be captured through data engineering, yet on the other hand accepting quantitative measures of human intelligence as evidence that some racial groups are more intelligent than others. If true intelligence is embodied, analog, then some aspect of it will escape any quantitative test. Part of the cult of quantity is the veneration of a certain type of intelligence: the ability to manipulate symbols and operate in abstraction. It is systemically recognized and economically rewarded (hedge fund quants do exactly that, manipulate symbols).
I must have missed that essay! You had another, a while ago, that I should have mentioned - the one discussing what happens when the machine gets trained on its own data, and real life gets cut out. Either a contraction to an ideational point, or fuzzing out into static.
On the IQ question, I think that needs to be treated very narrowly. What it measures is real, and those differences between people and peoples are real - but, they're not anything like the whole story. This is why I'm skeptical of any simplistic narrative of supremacy. One person is better at some things, and worse at others, than another. Who's "better"? It depends on the context. In the end there's really just difference and variety. But that too gets misused ('diversity is our strength' as an excuse to ignore contextually relevant differences in ability), which is a large part of why IQ differences have become such an idée fixe in the discourse.
The veneration of quanta over qualia gets at the heart of the entire issue. To a certain mindset there is only quantity. To another, there is both quantity and quality, but quality is recognized as more important. Right now our society is quite firmly under the spell of the quantitative, which is why it is so aesthetically, spiritually, and emotionally empty. But I think this is in the process of changing - whatever the story of this century will be, it will involve the re-enchantment of the world as well as our re-embodiment within it. I think as that happens, currently fraught issues such as racial differences in IQ will simply become much less interesting to everyone.
"It’s emotionally impossible to place any value on the output of an algorithmic engine."
Yet somehow artists who are known to use autotune and formulaic songwriting are still very popular. My counter to your article is that you are projecting yourself onto the normies. No matter how fake and gay the internet gets, a large majority of people will still think it's great. You can't say this isn't true when you realize that 'Wheel of Fortune' has been on the air for over 40 years.
A few of us will be disconnecting in ever greater ways, but there will be few of us, with much geographic distance between us.
I suspect far more people are already unplugging than we know. Of course, once they withdraw from the public Internet, how do you know they're their? They become effectively invisible.
As to the popularity of manufactured pop music, music has never been less important, culturally. I suspect this is not accidental. For most it's mere background noise.
However, you're correct that not all will pull away. As I said near the end - many will be lost.
Aha, now I'm pretty sure you're not an AI. Surely only a human brain would make this mistake (one of a type that I have made myself, even though I'm a stickler on spelling and grammar, and relatively intelligent).
I nearly always catch that kind of typo in the work of others, but it's very hard to catch it in one's own writing. In this case, spell-check wouldn't catch it, either. I like that you are leaving it - maybe we all should care a little less about our typo faux pas in order to prove our humanity?
Yes, we so often see what we expect to see, no matter how many times we scan past it. The there/they're/their mix up is quite common, it seems. Stupid people make the mistake because, well, they're stupid. Smart people make it because our brain is a diabolical demon sometimes. :)
Unless JC-AI is programmed to make occasional errors to get us to believe he is real. Like calligraphy programs that are designed to not be perfect, as perfection makes it look obviously fake.
Anecdotally speaking, I don't know very many people who still use social media or read online articles besides people like us in private chats. I work with young, normal, healthy people and most of them never even pull out cellphones compared to a decade ago. Shit, back in 2009 everyone I met had a facebook or Twitter, now I seem to be the only one who does anything online.
Normal and healthy young people are very rare these days ;)
I work with a mid-20s male and female, and some of the things I learn of their generation are both terrifying and saddening. Neither one knows anything of mainstream music, they both listen to groups that few outside of their genre have heard of. And they aren't hipsters, that just seems to be what some of the kids these days are gravitating towards. I'm slightly older than both of them combined, yet all 3 of us have never heard a Taylor Swift song to the end.
Most of their mass online experience seems to be in the realm of computer gaming, and social media with people they know IRL. At least as far as an old fogey like me can tell. Sometimes when it's just the 2 of them talking it's like a twin language, and I'm not sure what they are saying.
On the point about the importance of asking questions, and having worked in the education field, I concur. It has been quite a while since I concluded that the lack of knowledge or education among the general public is not due to a lack of resources or lack of opportunity, but a lack of curiosity. The problem is not that they don't know but that they don't WANT to know.
Curiosity is, I'm afraid, a very rare characteristic exhibited by very few people. For most, they want enough education or knowledge to be able to do their jobs and operate a dishwasher. Beyond that, they may as well be bots themselves. They are completely empty.
Curiosity gets beaten out of kids quite young, and I suspect the school system has something to do with that. There's also probably a degree of natural maturation. Curiosity requires a certain neurological neoteny - a child's mind, one that still asks questions, particularly about things it *thinks* it knows. Few maintain this. The decline in neuroplasticity with age is implicated here, although I also know quite a few people who manage to maintain this.
A large part of that is dependent on the parents, and their patience. I go out of my way to nurture my daughter's questions, especially when she's asking why. But it can be EXHAUSTING. Worth it, but it's easy for parents who are already tired or mentally drained to start snapping at their kids over it.
I’ll add my (conditional) concurrence with this thought: why are the parents like that? (And acknowledge this is partly a chicken-egg phenomenon). I’m sure parents could crush the curiosity in kids before the schools get them, HOWEVER… the schools could also (re)-ignite that spark. They don’t. They extinguish it - and IMO - are so designed. I would add a link to my own (multiple) rants on my stack but it would be unseemly to John.
It's certainly a chicken/egg question. Parents that are intelligent and curious are more likely to have intelligent and curious children, and to foster this in them. Is upbringing or genetics more important? Hard to say.
The less the parents encourage and enable it, the more important the genetics, and vice versa.
That, and it's really not either/or - genetics simply puts a hard cap on "max level" so to speak, plus weights the odds of someone doing something, or not.
I'm not sure if it's curiosity that gets beaten out of kids or if they're simply trained to tolerate boredom. Based on my own observations and experience with children, it seems that those who have easy access to entertainment (TV, video games, so-called "smart" phones, etc.) will have no incentive to explore and create. It is an aversion to boredom that seems to drive children to read or draw, to play make-believe and to explore the larger world.
Not so much an aversion to boredom - television etc also help to alleviate it - as the habit of filling one's boredom oneself. There is indeed a connection between free play and curiosity.
Yes, there is a connection which is why free play should be encouraged. That's one of the reasons why there were no TVs or video games in my home when my children were growing up. When they complained about being bored, I'd hand them a book, give them paper and paints, send them outside to play (or to clean out the barn).
Schools, and it doesn't matter if we're in Athens 2 000 years ago or in Bismarck's Prussia, do indeed curb curiosity by necessity, if not by planning. There's just so many hours the teacher has to work with, so there's a limit to the time that can be allotted to students being curious in class.
No conspiracy or plan needed. Consider this, which is from my own career:
I have 33 students, age 17-18. I meet them for 90 minutes per week, class being political science in practice with examples from Sweden's three levels of administration. Deduct time for class coming to order, refreshing from last time, information on homework for next week, et cetera. At best, I now have 75 minutes left.
And the students have the right to a 5-minute breather halfway through. And coming to order takes 5 minutes. We now have 65 minutes effective time left - divide that by 33 and that's how much time I can give each student. And every minute spent that way eats up what I had planned for the lesson.
And I had ten minutes between classes to scramble to the office, switch papers, and scramble to next class. Who needs conspiracies curbing curiosity? Logistics are enough.
The other reason is, being curious about the wrong thing in the wrong way means you lose status and position; therefore, only be curious about what the group-mind approves of. Which is all perfectly natural for us, and is the main reason all of us nitpicking and scratching at the eternal whys and wherefores of life, the universe and everything are always slightly oddball, personality-wise.
The idea of a conscious conspiracy is certainly attractive, since my above "Hecture" is just too bleak to be palatable - indeed, I'd prefer the conspiracy! Alas and alack, such is not the case.
You've got the right of it. It's inherent to the one-to-many broadcast model of lecture pedagogy. Probably not accidental that the sciences got their start amongst the petty nobility, who could afford private tutors for their offspring. Extended one-on-one pedagogy leaves much more room for open-ended curiosity.
The other problem is that the classroom has expanded to occupy too much time, as any time not spent studying is deemed wasted. But curiosity is best fostered not by pestering adults with questions, but by the child learning to seek out answers on their own. This happens best in open ended play, either alone, or with small groups of peers.
They can pass all the laws they want, but stopping people from growing food, *especially* in permaculture style which is often indistinguishable from forest or meadow from 10,000 feet overhead, is a pipe dream they'll never achieve.
IRL they've never been much good at enforcing the law outside of their panopticon cities, and pretty bad within them too. Stopping people from growing cucumbers on the balconies of their CAFO-style shoebox apartments will be a great tragedy to urban foodie hipsters I'm sure, but the idea that they're going to stop people out in the sticks from scattering seeds in a meadow is laughable.
I suspect even low-flying drones would have a tough time analyzing the makeup of a permaculture meadow or food forest and determining there's an unusual amount of edible plants growing in it. Unless we're talking really-low altitude, in which case, screw ECM, just shoot the thing down. You can police some of the people some of the time, but you can't police all the people all the time. Especially not out in the middle of nowhere.
With the proper illustration, this could be a very successful coffee mug. A good phrase, worth developing and re-using.
If showing a buff physique is a prerequisite to being invited into dissident right clandestine circles, I am, to use the traditional term, shit out of luck. Old, forgetful and flabby is no way to go through life, son, what's left of it! I will carry on as a solo performer, shunned by the beefcake yet brainy young men who will build and operate the future. You guys will do just fine, I'm sure.
Last fall I reread The Past Through Tomorrow: Future History Stories. Still brilliant, a little depressing, since that is the future we should have had. On the other hand, Heinlein predicted The Crazy Years, where everything would be a mess and there would be an interregnum in space travel and technological regression. So, he knew it was an up-and-down process.
We are 100% in the Crazy Years, and while it does seem like the woke fever is starting to break, I suspect we haven't seen anything yet.
For example: consider what happens to the culture when everyone withdraws not only from legacy broadcast media, but from algorithmically driven social media, into little subcultures scattered through an archipelago of group chats. Ontological fragmentation into a zillion bizarre little cults is my guess.
So, I suppose the entire described scenario will play out a second time when the humanoid robots come out and can infiltrate your real-life friend group and attend the group chat meetups.
Great article and the one thing I took from it is hope. Hope that the mindless screen gazing I see going on around me will be replaced with human interaction again.
On this mild spring day with the window open, a cool breeze coming in, my mind drifts to thoughts of going outside and leaving the work on my computer for another day.
This is phenomenal and so true. Greetings from the Las Vegas DMV where, like a lot of government offices in Phuket too, the people working inside can readily confirm that I am not AI. I laughed a lot at the evolution of a real flesh-and-blood person into the necromancers: "Imagine a would-be influencer, a woman of mediocre talents, but great ambition, or at least greed, and a total lack of moral scruples or shame." This is already happening. My facebook page is inundated with scammers, including a few actual flesh-and-blood ladies I went to high school with who were trying to recruit me into some CashApp scam. All I needed to get paid was to pay them a processing fee first and/or provide my real flesh-and-blood ID details to join the scam! The devolution of facebook is a case in point: at one time I had a rule that I would only accept friend requests from people I had met at least once in real life, with a few exceptions made for my half brothers and my employers. Those are the ones I track, and actual verifiably human engagement on the site is down at least 60% or so. It's just a newsfeed of AI-driven tiny home and nature pictures curated just for me.
My high school senior daughter has a similar rule in her discord private group chats: every new member must be known in person by at least one member of the group. They meet up in person as circumstances allow, such as a bowling excursion I took a few of them to in January.
Here at the DMV, which provides much better people-watching opportunities than I have at home while staring at a screen reading this essay and responding to it, device engagement is down. There are some people engaged in conversations who appear to be total strangers to each other, and only about 50% or so are staring at screens. I think everything I've done online and off for the last few decades has been to ensure that my story is verifiably real, human, and cannot be mimicked...
Excellent comment. Pinned.
I'm not really prophesying here - like you, I'm observing the same thing. People are quietly stepping away from the screens and re-engaging with one another. Still early stages, and there's a lot further to go. But it's a very healthy sign.
I have started my own genealogy site for members of my extended family, everyone being of exactly known relation. Part of it is a family tree to explore the relations, part is a forum where people of common descent and those married to them can converse online. It hasn't taken off, but I get a few people commenting on it now and then. I still think it could get popular. Invitation comes from being added to the family tree.
I love this idea.
That's a great idea! I still want to explore the hidden Hawkins history but it is a project which has been backburnered for now. My grandmother's side of the family was very proud of their heritage in the US however and we have a family reunion that happens every August in Ohio to this day. It was suspended for a few years during the pandemic of course. I should inquire about that before the tradition gets buried as my great great uncle's daughter is running it but she's getting up there in years herself...
I'd say definitely try to start up that family reunion again. It's worth the work.
Another part of my website project is to make real books out of family trees. I've printed and handed out about eight books so far, with better data and photos as people update the trees on the site. I use a site that lets you upload a PDF and print a book on demand for about $20 paperback and twice that for hardcover.
Future generations who find the books can then also arrange reunions without starting from scratch.
Where do you find the time, Patrick?! :)
You've got your hand in so many things.
I retired a couple of years ago. That gave me the time.
It also gave me the glorious freedom not to care at all whether I'm employable in California, where most employers are proudly intolerant of any dissent from woke orthodoxy. Example:
https://patrick.net/post/1379912/2023-08-07-record-of-my-having-been-denied-a-job
Connectivity can be made less bad by having an interface which doesn't have infinite scrolling, pop-up notifications, ads, or suggested clickbait. See https://conntects.net
But whether a paid service can keep the spammers at bay is indeed an interesting question. There WILL be armies of programmers working on the problem. There have been such armies for quite a few years, going back to the Bayesian filters used to block spammers.
To some degree, the Internet has been partially dead for decades. Once upon a time, the Internet was an exclusive domain of high IQ humans. USENET was amazing. When HTML was invented, there was still the filter of people smart enough to learn HTML and get hosting set up. Once Blogger and Wordpress democratized the Internet, the density of interesting content dropped. I still pause and look around whenever I come across a site that is clearly hand-crafted.
Google has been battling splogs for a long time. They had to scrap their original algorithm in favor of AI and extra dependence on "official/credible" sites to avoid manipulation. This is one reason why Google's algorithm has become more woke over the past few years. There are just not that many true news sites that aren't woke. An actual news network which actually lived Fox News' "We report; you decide" slogan could sway things back. Instead, we had the once credible yet conservative Forbes open its site to sploggers in an attempt to boost revenues.
Maybe with AI, the sploggers will finally win and paper books and real meetings will become the norm again. But as this war has been going on a long time, I won't write off the online certifiers yet.
In the end, I don't think there's any algorithmic method for keeping out the AI sludge. Every algo developed to detect AI just provides training data for the AIs trying to get around it.
This is why I emphasize personal, direct connections so strongly here. It's really the only way. Those trust networks can in turn be leveraged to provide a curated database of human-generated information. I don't see any other means of doing this.
As to social media, I strongly suspect its days are numbered. Personally, I already spend far more time in GCs than I do in my social media feeds.
What is a GC?
Group chat
Ah! Most social networks -- including the one I linked to -- support that. That won't go away as long as there is out of band verification of the participants.
My point isn't that they're going away, it's that GCs are becoming the main event. Indeed, it's very likely that for a majority of people, they already are.
Thorough, and thoroughly entertaining, John, as always.
My thoughts dovetail with yours re: the inevitable bounce-back into the real, particularly in the realm of art. From my essay, Art vs. AI -- Bring it on, ChatGPT:
"Eventually, though, we will tire of it. We will want something different, and in this case, “different” will mean “real.” We will hunger for words labored over by human beings the way the body craves homemade soup after too many gummy bears.
Even now, we still want to experience human artistry in person. We still go to the theatre, even after movies and DVDs and live streaming all said theatre was dead. We still go to sporting events, music concerts, yoga classes, and outdoor festivals.
Last weekend I strolled around Localtopia, one such festival in St. Petersburg, Florida, and took in the infinite variety of human arts on display: kids and adults covering an entire school bus with every paint color imaginable; tote bags made out of leather book covers from the 50s; hula-hoop dancing lessons under shady oaks.
I turned a corner and came upon a curly-headed guy seated at an old typewriter, clacking away.
Seated off to the side was a woman, watching as he occasionally stopped, lifted his eyes to some unseen muse above, then went back at it, pounding those keys with assurance.
The sign draped over his tiny table read,
'Gio's Typos, Poet-for-Hire.
Pick a Topic, Receive a Poem.'
The woman sitting next to Gio waited patiently for her poem, the one he was drawing out of the ether at that moment in time, just for her, on that street filled with the kaleidoscope of humanity.
Behind her was a line of others, chatting while they awaited their turn to hand him $20 and tell him their names, their loves, their griefs, their stories — all unique.
All of those people could have created their own HIGH QUALITY POEMS IN SECONDS, but instead, they chose to wait for the guy who types poetry on a vintage Hermes 3000 typewriter onto the back of old National Geographic maps."
AI may one day replicate human experiences, but it cannot replace human interaction. It may one day be spliced into neurons, but it cannot be intertwined with the human heart. It may one day be able to speak in my voice, but it will never speak from my soul.
https://marypoindextermclaughlin.substack.com/p/art-vs-ai
What a wonderful story. Thank you for sharing that!
Great comment. Funny I might have seen you in St. Petersburg the same weekend...doing exactly the same thing!
That's nifty. Do you live in the area?
Yes. Sarasota.
Do you live in the free country of Florida?
Indeed I do! North of Tampa. Moved from the disturbingly less-free country of NY
Ha. Yeah, we moved from CA to Florida in May of '20 to escape totalitarianism
What made you choose Sarasota, if you don't mind my asking?
"It’s just a rat’s nest of algorithms, coldly and unsympathetically studying your responses in order to refine its model, with the sole objective of manoeuvring you with inhuman patience towards some predefined ideological or commercial goal."
John Carter, I'm guessing you don't have any children. :-)
Ha!
We interrupt the reading to bring you the following comment:
> But a human is required to set that chain in motion in the first place.
That awkward moment when you're ONE STEP AWAY from the old Medieval proofs for God. xD
We're trying to make gods of machines, not realizing that we are the gods of the machines.
Another interruption. xD
> dissident right networks are well-placed to leverage their experience to navigate this landscape. This is similar to how the ...
... Communist Party of Yugoslavia was forced to operate underground for a decade before the Axis invasion so was well poised to start its ultimately successful overthrow of fascist occupiers. xD
Hey, we all have our heroes. xD
About twenty years ago, I got this idea in my head to try to read a book while going for a walk. I probably picked it up from some book or other about olden days, when it wasn't too uncommon a practice.
After a wobbly start, which included consciously training my peripheral vision to recognise the sides of curbs and crossing marks and such (as I lived in a city then), I found to my delight that it was trivial, provided the book wasn't some 'Lord of the Rings' in one volume kind of thing.
As this was before the proliferation of portable computers+screens with a telephone function built into them, I sometimes got comments that it looked weird or goofy. Nothing unkind, just people remarking on an oddity of behaviour. Funnily enough, Middle Easterners were the most accepting ones. Perhaps walking and reading is more common in some areas there?
To this day, people who see nothing strange in zombie-ing about with their portable telescreens still raise their eyebrows at a man pulling out a soft-cover novel while walking through town or waiting on the bus. No particular point to this ramble but I have to wonder, is there a cognitive difference in what is happening in the brain? I suspect so, but have no proof. Reading a novel does not give that dopamine kick social media triggers, nor does it work to wreck your attention span or ability to focus; rather the opposite in fact.
Do not, should you try it, do so while riding a bike.
When it comes to AI/robot intelligence, the closest to it is human autistics of the variety with normal to above normal general IQ. Not that they lack conscience or emotion the way a machine does, that's not what I mean - no, what I mean is this:
To many of them, emotions et cetera are just another factor to be consciously weighed regarding the issue at hand. This is what makes them seem cold to others: if emotion(ality) or similar isn't part of the situation, they will not display it or take it into account. Getting them to understand and accept that almost all humans they will ever interact with do not, indeed cannot in the case of many, separate emotion from logic or fact is difficult at best, nigh-on impossible at worst. Sadly, many of them often retreat behind an emotionless shell for lack of ability to handle their own emotional responses.
I think perhaps developers of AI would do well to study human autistics, because the only intelligence we can compare AI to is ourselves and emotion(ality) is part of human intelligence.
When I was in basic training, as we were being marched to the mess one morning, I fell asleep while walking. Just sort of dozed and allowed the sergeant's voice to guide me. Wasn't even using my peripherals in that case.
There's definitely something distinct about using phones vs books ... the former grab your attention in a way books simply don't. It's the interactivity, the full-colour display, the constant notifications, the endless possibilities implied by access to everything in your hand. A book provides none of that.
There's definitely a similarity between autists and AIs. It's no accident that so many programmers are themselves at least mildly autistic. They're trying to replicate cognition as they experience it, following the left-brain methodology - assembly from components - that they naturally understand. Of course that isn't at all how the human mind actually works.
Someone's been reading Iain McGilchrist...
Not sure how much of my work you've read, but: yes, and I talk about him a LOT.
I've yet to read the article you did that directly addresses his writing, but I noticed many parallels from the ones I've encountered so far. I'm reaching the end of The Matter with Things and find his work exceptional... and yours, as well!
The Matter With Things is essential reading, and not only because it makes my wordiness look comparatively restrained. I finished it about a year ago, and it was worth every page. Haven't even come close to digesting it.
I agree wholeheartedly. His insight into the differing views made available by the two hemispheres really tied together my experience with psychedelics, on the one hand, and nearly 20 years of Zen meditation practice, on the other.
Suggesting a use for NFTs? Now I know you're a crypto shill-bot.
In all seriousness, another great essay. I used to wonder what technology my kids would master that I'd have no idea how to use; now I think they'll look upon my smartphone use the same way I look at my parents' Facebook use: a blend of concern and pity. I actually hope that's the case - or, better yet, that I use screens infrequently enough that they don't have to worry!
Within a generation smartphones will go the way of pocket watches.
Tbh I still don't understand NFTs.
One of my best friends has some business going involving NFTs. He has explained it to me slowly, in small words, and patiently, multiple times. I still have no idea what they are or why anybody cares about them.
At a certain point one begins to suspect that it is not one's ability to understand that is the problem.
If NFTs somehow end up mattering it'll confirm I'm a midwit - not close enough to either end of the bell curve to have figured it out.
Classic
Our bodies are decaying. Our food is poison.
Until we fix this, nothing else matters. We are first and foremost physical creatures.
It's extremely important, I agree.
True. We will all be consistent when we are dead.
We exist between nothing, for a brief time, to make life have meaning during our journey to death.
The irony.
"Say you’re a foreign government, and you want to undermine an adversary."
I'd restate as, 'say you're our domestic government, and you want to undermine your enemies, the peasants.'
===
"Initially, the LLMs are there to agree with their targets, ingratiating themselves by saying interesting things that constructively build on whatever point the target is making in the discourse, supporting them in arguments with others, and so on. Over time, trust is built up. Then, gradually, the shillbot starts trying to vector the target."
This is exactly what the Assassins did, and is the real reason they were able to earn their moniker: trusted servants deciding the way to paradise lay in obedience to the Old Man of the Mountain, even when there was no chance of escape. Perhaps MKUltra is about to be replaced with something much more reliable.
===
If our side ever decides to get political, gaining local and state power in order to control a region, we'll most likely have to go back to a literal whistle-stop campaign, so everyone can see that our leaders are actual people.
You're absolutely correct. There is zero reason to expect that governments wouldn't use this technology domestically, and every reason to expect that they would and, indeed, already are.
Terrific essay!
A foundational ideology for the belief that AI will eventually equal and surpass human intelligence is that intelligence consists in the manipulation of bits. A deeper and more sinister idea is that reality is fully quantifiable; that only the measurable is real, and that everything real is measurable. Granting these assumptions, a perfect simulation is possible in principle. (Brains-in-vats theories also depend on this.)
Paradise is to be achieved by corralling all of matter into one huge data set. Then we can engineer the social and physical world to perfection by treating it as a giant min-max problem.
We’ve come quite far already. The hideous architecture you speak of accomplishes one thing admirably: It maximizes some quantity (such as floor space) per dollar. More generally, we live in a society of obscene quantitative excess and grievous qualitative poverty.
Such is the result of what I call “the cult of quantity.”
I treated many of the ideas in this essay (though with less acumen and verve) in a few articles in the past year. Most recently this one: https://charleseisenstein.substack.com/p/machines-will-not-replace-us.
I write on LibreOffice too.
I wonder what you would say about what looks like a contradiction: on the one hand, rejecting that intelligence can be captured through data engineering, yet on the other hand accepting quantitative measures of human intelligence as evidence that some racial groups are more intelligent than others. If true intelligence is embodied, analog, then some aspect of it will escape any quantitative test. Part of the cult of quantity is the veneration of a certain type of intelligence: the ability to manipulate symbols and operate in abstraction. It is systemically recognized and economically rewarded (hedge fund quants do exactly that, manipulate symbols).
I must have missed that essay! You had another, a while ago, that I should have mentioned - the one discussing what happens when the machine gets trained on its own data, and real life gets cut out. Either a contraction to an ideational point, or fuzzing out into static.
On the IQ question, I think that needs to be treated very narrowly. What it measures is real, and those differences between people and peoples are real - but, they're not anything like the whole story. This is why I'm skeptical of any simplistic narrative of supremacy. One person is good at something, but worse at another thing, than another. Who's "better"? It depends on the context. In the end there's really just difference and variety. But that too gets misused ('diversity is our strength' as an excuse to ignore contextually relevant differences in ability), which is a large part of why IQ differences have become such an idee fixe in the discourse.
The veneration of quanta over qualia gets at the heart of the entire issue. To a certain mindset there is only quantity. To another, there is both quantity and quality, but quality is recognized as more important. Right now our society is quite firmly under the spell of the quantitative, which is why it is so aesthetically, spiritually, and emotionally empty. But I think this is in the process of changing - whatever the story of this century will be, it will involve the re-enchantment of the world as well as our re-embodiment within it. I think as that happens, currently fraught issues such as racial differences in IQ will simply become much less interesting to everyone.
"It’s emotionally impossible to place any value on the output of an algorithmic engine."
Yet somehow artists who are known to use autotune and formulaic songwriting are still very popular. My counter to your article is that you are projecting yourself onto the normies. No matter how fake and gay the internet gets, a large majority of people will still think it's great. You can't say this isn't true when you realize that 'Wheel of Fortune' has been on the air for over 40 years.
A few of us will be disconnecting in ever greater ways, but there will be few of us, with much geographic distance between us.
I suspect far more people are already unplugging than we know. Of course, once they withdraw from the public Internet, how do you know they're their? They become effectively invisible.
As to the popularity of manufactured pop music, music has never been less important, culturally. I suspect this is not accidental. For most it's mere background noise.
However, you're correct that not all will pull away. As I said near the end - many will be lost.
"how do you know they're their?"
Aha, now I'm pretty sure you're not an AI. Surely only a human brain would make this mistake (one of a type that I have made myself, even though I'm a stickler on spelling and grammar, and relatively intelligent).
Shit! You found it! The first typo!
It is not only our vulgarity that proves our humanity, but our retardery.
I'm leaving that in, just to keep things human.
I nearly always catch that kind of typo in the work of others, but it's very hard to catch it in one's own writing. In this case, spell-check wouldn't catch it, either. I like that you are leaving it - maybe we all should care a little less about our typo faux pas in order to prove our humanity?
I generally care to an obsessive degree about the grammatical and typographical integrity of the text. It's professional pride.
But there's always at least one that sneaks through...
Yes, we so often see what we expect to see, no matter how many times we scan past it. The there/they're/their mix up is quite common, it seems. Stupid people make the mistake because, well, they're stupid. Smart people make it because our brain is a diabolical demon sometimes. :)
Unless JC-AI is programmed to make occasional errors to get us to believe he is real. Like calligraphy programs that are designed to not be perfect, as perfection makes it look obviously fake.
Yes, that thought has crossed my mind.
Anecdotally speaking, I don't know very many people who still use social media or read online articles, besides people like us in private chats. I work with young, normal, healthy people and most of them never even pull out cellphones compared to a decade ago. Shit, back in 2009 everyone I met had a Facebook or Twitter; now I seem to be the only one who does anything online.
" young, normal, healthy people"
Normal and healthy young people are very rare these days ;)
I work with a mid-20s male and female, and some of the things I learn of their generation are both terrifying and saddening. Neither one knows anything of mainstream music, they both listen to groups that few outside of their genre have heard of. And they aren't hipsters, that just seems to be what some of the kids these days are gravitating towards. I'm slightly older than both of them combined, yet all 3 of us have never heard a Taylor Swift song to the end.
Most of their mass online experience seems to be in the realm of computer gaming, and social media with people they know IRL. At least as far as an old fogey like me can tell. Sometimes when it's just the 2 of them talking it's like a twin language, and I'm not sure what they are saying.
I've been noticing the same thing. People are developing immunity to the digital crack.
If it turns out to be only the Remnant, Albert Jay Nock has assured us that is enough. (“Isaiah’s Job”)
On the point about the importance of asking questions, and having worked in the education field, I concur. It has been quite a while since I concluded that the lack of knowledge or education among the general public is not due to a lack of resources or lack of opportunity, but a lack of curiosity. The problem is not that they don't know but that they don't WANT to know.
Curiosity is, I'm afraid, a very rare characteristic exhibited by very few people. For most, they want enough education or knowledge to be able to do their jobs and operate a dishwasher. Beyond that, they may as well be bots themselves. They are completely empty.
Curiosity gets beaten out of kids quite young, and I suspect the school system has something to do with that. There's also probably a degree of natural maturation. Curiosity requires a certain neurological neoteny - a child's mind, one that still asks questions, particularly about things it *thinks* it knows. Few maintain this. The decline in neuroplasticity with age is implicated here, although I also know quite a few people who manage to maintain this.
A large part of that is dependent on the parents, and their patience. I go out of my way to nurture my daughter's questions, especially when she’s asking why. But it can be EXHAUSTING. Worth it, but easy for people who are already tired or mentally drained to start snapping at their kids over.
This is also quite true.
I’ll add my (conditional) concurrence with this thought: why are the parents like that? (And acknowledge this is partly a chicken-egg phenomenon). I’m sure parents could crush the curiosity in kids before the schools get them, HOWEVER… the schools could also (re)-ignite that spark. They don’t. They extinguish it - and IMO - are so designed. I would add a link to my own (multiple) rants on my stack but it would be unseemly to John.
It's certainly a chicken/egg question. Parents that are intelligent and curious are more likely to have intelligent and curious children, and to foster this in them. Is upbringing or genetics more important? Hard to say.
To that I’ll simply say that I have never met a child under 5 who didn’t WANT to understand.
The less the parents encourage and enable it, the more important the genetics, and vice versa.
That, and it's really not either/or - genetics simply puts a hard cap on "max level" so to speak, plus weights the odds of someone doing something, or not.
I'm not sure if it's curiosity that gets beaten out of kids or if they're simply trained to tolerate boredom. Based on my own observations and experience with children, it seems that those who have easy access to entertainment (TV, video games, so-called "smart" phones, etc.) will have no incentive to explore and create. It is an aversion to boredom that seems to drive children to read or draw, to play make-believe and to explore the larger world.
Not so much an aversion to boredom - television etc also help to alleviate it - as the habit of filling one's boredom oneself. There is indeed a connection between free play and curiosity.
Yes, there is a connection which is why free play should be encouraged. That's one of the reasons why there were no TVs or video games in my home when my children were growing up. When they complained about being bored, I'd hand them a book, give them paper and paints, send them outside to play (or to clean out the barn).
It's a double whammy:
Schools, and it doesn't matter if we're in Athens 2,000 years ago or in Bismarck's Prussia, do indeed curb curiosity by necessity, if not by planning. There are only so many hours the teacher has to work with, so there's a limit to the time that can be allotted to students being curious in class.
No conspiracy or plan needed. Consider this, which is from my own career:
I have 33 students, age 17-18. I meet them for 90 minutes per week, class being political science in practice with examples from Sweden's three levels of administration. Deduct time for class coming to order, refreshing from last time, information on homework for next week, et cetera. At best, I now have 75 minutes left.
And the students have the right to a 5 minute breather halfway through. And coming to order takes 5 minutes. We now have 65 minutes effective time left - divide that by 33 and that's how much time I can give each student. And every minute spent that way eats up what I had planned for the lesson.
And I had ten minutes between classes to scramble to the office, switch papers, and scramble to next class. Who needs conspiracies curbing curiosity? Logistics are enough.
The other reason is, being curious about the wrong thing in the wrong way means you lose status and position; therefore, only be curious about what the group-mind approves of. Which is all perfectly natural for us, and is the main reason all of us nitpicking and scratching at the eternal whys and wheretofores of life, the universe and everything are always slightly oddball, personality-wise.
The idea of a conscious conspiracy is certainly attractive, since my lecture above is just too bleak to be palatable - indeed, I'd prefer the conspiracy! Alas and alack, such is not the case.
You've got the right of it. It's inherent to the one-to-many broadcast model of lecture pedagogy. Probably not accidental that the sciences got their start amongst the petty nobility, who could afford private tutors for their offspring. Extended one-on-one pedagogy leaves much more for open-ended curiosity.
The other problem is that the classroom has expanded to occupy too much time, as any time not spent studying is deemed wasted. But curiosity is best fostered not by pestering adults with questions, but by the child learning to seek out answers on their own. This happens best in open ended play, either alone, or with small groups of peers.
I've found six students to one teacher to be the ideal number, with three as the lower end and nine as the higher.
Unsurprisingly, our military trains people to work in combat groups (8-12) and in combat pairs.
Comparing my dad's books on education - him having trained soldiers for 45ish years - to mine from the Teacher's College just makes me want to weep.
It's like comparing the tale of Beren and Luthien with something by Claudine Gay.
It has *everything* to do with that.
“Permaculture farming stands out as an obvious possibility.”
It would be, except for the fact that governments are trying to make farming illegal, and the skies are being sprayed daily.
It all feels so hopeless sometimes. They (whoever “they” are) are literally blocking the sunlight from reaching the earth.
And it’s hard to talk about it; to most people it sounds insane.
And if we can’t even acknowledge that it’s happening, how can we do anything about it?
Luckily, governments aren't the only ones who get a vote.
Chemtrail type stuff is discussed more openly every day. The regime's control only goes so far, and is breaking down as we speak.
They can pass all the laws they want, but stopping people from growing food, *especially* in permaculture style which is often indistinguishable from forest or meadow from 10,000 feet overhead, is a pipe dream they'll never achieve.
IRL they've never been much good at enforcing the law outside of their panopticon cities, and pretty bad within them too. Stopping people from growing cucumbers on the balconies of their CAFO-style shoebox apartments will be a great tragedy to urban foodie hipsters I'm sure, but the idea that they're going to stop people out in the sticks from scattering seeds in a meadow is laughable.
I suspect their plan is basically: drones.
However, if it comes to that, ECM is a thing.
I suspect even low-flying drones would have a tough time analyzing the makeup of a permaculture meadow or food forest and determining there's an unusual amount of edible plants growing in it. Unless we're talking really-low altitude, in which case, screw ECM, just shoot the thing down. You can police some of the people some of the time, but you can't police all the people all the time. Especially not out in the middle of nowhere.
And especially not when the people have no interest in being policed.
Good post.
"... a soft Butlerian Jihad ..."
With the proper illustration, this could be a very successful coffee mug. A good phrase, worth developing and re-using.
If showing a buff physique is a prerequisite to being invited into dissident right clandestine circles, I am, to use the traditional term, shit out of luck. Old, forgetful and flabby is no way to go through life, son, what's left of it! I will carry on as a solo performer, shunned by the beefcake yet brainy young men who will build and operate the future. You guys will do just fine, I'm sure.
Generally allowances are made for age. The key thing is that one is not neglectful of one's embodied self.
Noted.
I'll see what I can do to firm up for the day my application to the Brotherhood is under review.
BTW if you have not read, or not recently, Robert A. Heinlein's novella If This Goes On ..., it's still a classic about underground resistance.
I'm pretty sure I've read that, waaaay back in the day. Read all the Heinlein I could get my hands on.
Last fall I reread The Past Through Tomorrow: Future History Stories. Still brilliant, a little depressing, since that is the future we should have had. On the other hand, Heinlein predicted The Crazy Years, where everything would be a mess and there would be an interregnum in space travel and technological regression. So, he knew it was an up-and-down process.
We are 100% in the Crazy Years, and while it does seem like the woke fever is starting to break, I suspect we haven't seen anything yet.
For example: consider what happens to the culture when everyone withdraws not only from legacy broadcast media, but from algorithmically driven social media, into little subcultures scattered through an archipelago of group chats. Ontological fragmentation into a zillion bizarre little cults is my guess.
So, I suppose the entire described scenario will play out a second time when the humanoid robots come out and can infiltrate your real-life friend group and attend the group chat meetups.
That's what NeuraLink is for.
Great article and the one thing I took from it is hope. Hope that the mindless screen gazing I see going on around me will be replaced with human interaction again.
On this mild spring day with the window open, a cool breeze coming in, my mind drifts to thoughts of going outside and leaving the work on my computer for another day.
It really is awful being chained to screens all the time.
I say, as I sit in a cafe, typing away on my screen.
Ah, the irony.
Think I'm going to go for a walk.