The World Belongs to Thieves

Parts 1 and 2

It has taken me a bit of time to think this one through, as I’ve rewritten it after my original draft led my thinking in a different direction. The core of my thoughts still remains the same, but I’m still deciding if this will be a detriment to society.

Thus, I will lay it out here and let time figure it out.

There should be no surprise to anyone familiar with the history of Silicon Valley that AI generative models are built on mining (stealing) the data (or work) of everyone else. After all, this is the same place where Apple stole from Xerox, Microsoft stole from Apple and everyone else just followed suit at stealing from each other.

Good artists copy. Great artists steal.

This is just as true for the tech industry as it is for the magic world I spoke about in the previous post. However, at some point, you can only steal so much before it’s time to innovate. The major corporations make it their business to buy up startups (which is a business model in itself) instead of doing anything creative on their own.

The only other step is corporate espionage, which is a sore spot for me as a Canadian considering what happened to our major tech company and biology lab.

Which leaves us with the question of what our world values more: creative thinking or strategic domination.

The problem with the latter is we’ve equated it with the former. Originally, this trio of posts was about the death of imagination thanks to the advent of generative AI models. While it’s overly simplistic to attack the latest piece of technology as the end of humanity, I think it’s fair to say we now have a piece of technology that seems to be the natural end of what we value as a society.

Why bother putting any cognitive load on the brain when you have a piece of technology that will steal ideas from everyone?

Before someone accuses me of being a tech alarmist, I understand we’re not all trying to be creative artists or innovative thinkers. For menial tasks that need to get done, advances in technology have always been a help and I’m sure some would argue that we can make even greater leaps thanks to freeing up our cognitive load. In fact, we can argue that we can now put things together in ways we’ve never been able to before.

But if this is where we’re heading, is there any intrinsic value to “thinking outside the box?” If all the innovative thinking that we do is just an amalgamation of ideas by others, why bother encouraging creative thought? Perhaps this is actually what it means to be creative.

But, does this bring us down a slippery slope where we hit a stagnation of human thought and hand over our future to our own inventions?

Or will there be a backlash against this technological determinism we’re heading towards?

At the moment, from my standpoint, the world belongs to thieves. I just can’t figure out if it’s always been that way, or always will be.

When Magic Dies

In continuing with my previous post, I can’t help but think about the magic world I was a part of, loved and still cherish. While not pretending to rise above the criticisms I’m about to lay out, my concern for capturing the imagination of an audience started with what happened in that performance art.

The effectiveness of a magician relies on one crucial component: the method.

The method, or the secret if you will, is everything. When an audience knows the method, or even thinks they know it, the act is ruined. A polite crowd will still come along for the ride, but the wonder of the moment is lost.

A professional, or one with the attitude of a professional, will either spend countless hours practicing until the method is invisible or spend the same number of hours finding ways to misdirect the audience so they don’t suspect the method. The very best do both.

Early magicians will imitate the very best, or use the prescribed routines offered to them in their training books or videos. There is nothing inherently wrong with this when you’re learning as I find it akin to a budding musician playing the songs of their favourite artists. However, unlike a musician who plays cover songs and can have a fun career with it, you can’t have this in magic.

At some point, you’re going to need to bring your imagination and ingenuity into it (also a great marketing team—but that’s a whole other topic).

There was a worry that exposure videos on YouTube would bring the end to magic. Logically, if the success of a magician is dependent on the method, then the obvious conclusion is no magician can be successful thanks to the endless videos online that give away the method. However, this hasn’t been the case.

For one, careful and disciplined students of these videos can become incredible magicians.

Next, magicians have learned to keep their best routines away from video sharing sites.

And finally, the average person doesn’t spend their entire evenings searching for these methods.

What I’m seeing right now is something similar to what’s happening in education. There is a widening gap between the incredible magicians who are performing magic at levels that are unbelievably creative and original (a very small percentage), those who are doing a decent job imitating them (a slightly larger percentage than above) and the rest, who I lump into the categories of competent and horribly amateur.

I am not above reproach here. By the end of my tenure as a magician, I was in the competent category, at best, but always aiming for the top. The ambition was always to be an original.

I’m not seeing that anymore.

Just like the generative AI flood, magic seems to be aiming for that second tier. Let somebody else be original and the rest will imitate.

And just like students who use these tools in a poor and obvious way, the imitators will do the same, which is where the death of magic comes in. Nothing ruins magic like an act done poorly. And having to put in a great effort to make it look good is lost on so many.

Hence, only a handful from a deep pool rise to the top.

It’s just not enough.

That’s why I was so enamoured with Nate Staniforth’s work. He understood that magic was dying.

But maybe I’m wrong.

Actually, I hope I’m wrong.

I hope I’m only seeing a small slice of a much wider world and missing a bigger picture.

The thing is… I might be completely off base and the reality is much worse.

(Continued next post)

The Death of Imagination?

It’s hard not to wonder about the implications of generative AI as a theologian, philosopher and observer of culture. While the conversations about its future in education are in full swing, my best guess for that is a complete overhaul of the school system or a reversion to pre-digital methods. It’s hard to say which one would serve society better, but I’m sure some academic researchers completely disconnected from reality will decide its fate. I mean, why change the status quo?

My real concern lies in the imagination. Up until now, our tools have enabled us to be more productive and accelerate our thinking, but we’ve never hit a point where the thinking can be done for us. An argument can be made that the Internet has already made this happen, given the copy/paste that happens and the echo chambers of social media, and I would agree. However, that has more to do with laziness and a really low bar for critical thinking. Not imagination.

The worry is the thought that we’ve hit the apex of creativity. Perhaps my slant is toward art, but it feels like we’ve flatlined. It’s often said that all the stories we have are just takes on the same few themes (and Shakespeare supposedly did them best) and all our technological innovations are pale reinventions of what already works (e.g., music streaming services are just radio channels with more DJs available).

Are there exceptions?

Sure—every jump in ingenuity starts as the exception.

Where the exception differs is in the way it captures the imagination of the populace. Take, for instance, movies. It’s pretty clear we are running dry on ideas as we’re getting bombarded with remakes, sequels or yet another Marvel Cinematic release. Of course there’s an economic incentive for this, but taking chances on new ideas requires one to capture the imagination of others.

It will be increasingly difficult to do that when AI models spit out what has been done before in a very predictable pattern.

At the risk of sounding like some Luddite alarmist proclaiming the end of humanity over the next technological development: I’m not against the tools that make our cognitive processes easier. My question is whether access to these tools should be given without a need for them in the first place. We already have editors burned out from endless AI submissions for stories (to the point they’ve closed new submissions for the foreseeable future), and readers of indie books getting pissed at AI-generated stories (it’s so obvious that it hurts).

What happens when this bleeds into music? Or movies?

My hope is a renaissance of imagination, similar to what happened in the 60s/70s with music. Artists in their imperfect forms experimenting with sound, voice and heart in their lyrics. It was a generation that still inspires today and a spirit that is desperately needed now because, if I can be so bold, our world has jumped the shark.

Still Looking For It

There’s a great U2 song that I often come back to at different points in life because it means a little something different each time. While I’m sure the original intention of the lyrics is vastly different from what most take from it (including myself), and this can be a real sticking point for any writer, I find it’s a good foundation to work from.

As a young person, my search for happiness involved the usual criteria of outward signs of success: good career, home, health, relationships, etc. All those things that we dream about wanting in life, which are good things to have. To have otherwise is like playing life on hard mode. It’s possible, but why?

However, in the background of all of this was a seed of discontent; something unsettling.

Is this it?

The question eventually drove me to study mysticism in hopes of finding an answer. That eventually led me to study Theology, where the question aggravated me further because the solace of religion was shattered by learning about the blatant power structures it doesn’t even try to hide.

The philosophy side fared no better. While it’s a wonderful discipline that appealed to my off-centered, obsessive thinking (from which I’m forced to conclude all philosophers throughout history were actually on the spectrum of insanity), the question of meaning was moot. It’s not even discussed.

As life eventually settled in for me and I’ve come to a point where I’m at peace, new questions are festering. While they no longer cause existential dread or months of overthinking, they continue to spring up and keep me mentally off tilt.

They remind me that while I have everything I could want or need, there’s still something I’m seeking and haven’t yet found. Thankfully, I’m surrounded by incredible people who keep me grounded, allowing me to continue this search without losing my way.

But, part of me is beginning to think we never fully find what we’re looking for, no matter our circumstance, and the key is to accept that—to be comfortable with it.

Perhaps happiness is only in the search and not the destination and, more importantly, as Christopher McCandless discovered, is only real when shared with others. We can keep searching, keep seeking, and be comfortable that we may never fully get there, as long as we have others along the way to share in that journey.

So I still haven’t found it… but I also have.

A Freedom of Limitations

Give a child some paper and crayons and tell them to do whatever they want and they will shrug and idly doodle, quickly losing interest. Tell them to draw a nice picture for grandma and suddenly you see their creative spark fly as they diligently go to work.

As an educator, I’ve discovered that to bring out the creativity in students, you can’t simply leave things wide open. You have to set parameters and boundaries so they know what field they’re working in. A few will push those boundaries and want to step out (as expected), but they do so purposefully.

In a world that offers endless choice, the best course of action is to limit that choice for yourself. Set your own boundaries to work within and find the peace that comes with them. Otherwise, everything becomes overwhelming and we become anxious at all the options ahead of us.

For instance, I was at the grocery store recently and counted eight different brands of bread crumbs. Eight. I’m certain there are others as well if I were willing to venture further, but we’re just talking about bread crumbs here, right? Having lived a neighbourhood away from a bean-packaging factory, I can assure you the only difference between the various cans of beans you find in the aisle is the label. I have to think the same thing is happening with other items, such as those bread crumbs.

Yet, wanting to make the best choice, many will spend unnecessary time deciding on which brand to get.

In my year of enough, there is a satisfaction in knowing that my limits are already set and I don’t care to look at anything else as I enjoy what’s available to me right now. Should something come along, I’ll make a note of it and move on without spending time dwelling on it.

While there’s a freedom of having choices available to us, which is a freedom I wouldn’t want taken away, the most effective use of it is to decide on what’s necessary and ignore the rest.

Time to Think

Our constant connection to the world has us consuming endlessly, whether that’s social media, news, or video feeds. Even the barrage of incoming text messages further entrenches our minds in standby mode, waiting for the next thing to come in so we can respond.

Sometimes it comes as an active response (replying, commenting, etc.), but more often it remains a passive one that has us considering it for a moment before ignoring it completely. The latter, unfortunately, has trained people today to simply ignore requests and abdicate responsibility by assuming that if they just don’t reply to anything, it’ll go away without any consequence. That is a personal pet-peeve of mine, but I’ll save the ranting for another time.

While it may be easy to point the finger at technology and say this is a new or recent phenomenon, education has been plagued by this for quite some time. There is a curriculum to meet and there isn’t enough time to digest each of its parts before moving on to the next. Sometimes the pieces require an active approach, but most of them are passive.

“Here’s the content. Got it? Great. Moving on.”

That’s why I envy the math programs in Japan that choose to go narrow and deep with their content as opposed to wide and shallow. This is something I would love to see replicated across all disciplines. Mind you, some of the best teachers do this anyway without drawing attention to themselves.

On a much broader scale, what all this leaves out is time to think.

If you consider that many of the greatest thinkers, inventors and artists spent part of each day going for a walk (or a bike ride, in Einstein’s case), you see the one consistent habit: spend time consuming, then give the brain time to think.

Consider the insights that come about during a long car ride—we often refer to this as “highway stare,” but the revelations that come about are interesting. It’s an active meditative state that simply allows the mind to wander and see where it will go. And if it goes nowhere, that’s fine as well.

I mean, how often do we get to a point where we can just be in the moment without putting pressure on ourselves to gain something from it?

However, the greatest value of taking time to think is it gives us time to react appropriately, allowing us to control the narrative of our lives rather than having it fed to us by society.

The Countdown is Here

This is the point in the year when we really start looking ahead at what’s to come, while also taking a moment to reflect upon what has been. It’s a yearly ritual mired in clichés and endless online bombardments in case we’ve gone more than a moment and forgotten about it. While you may sense a laden cynicism between the lines, much of it has to do with my own realizations and failures.

However, this upcoming year, I’ve decided to partake in one ritual in which I see a much more mindful approach.

A close friend of mine informs me of a word around which he centres himself for the year. As part of his own reflection and growth, and I hope he doesn’t mind me briefly commenting on it, I’ve always found it to be a fruitful endeavour for him. However, it never occurred to me that I could take this on myself.

This is mainly because, in my years of studying the deepest questions of life while also continually getting called out by others for my serious social shortcomings, I’ve been too arrogant to consider it. It’s kind of a big blind spot for someone who ministers to young people to completely brush something off without trying to understand it first.

Yet, here I am teaching courses on morality and ethics with the ultimate aim of happiness in one’s own life, failing to take away my own big lesson in the end—and that’s the word ‘enough.’

In practical terms, I’ve looked at my home library, my “to-watch” list, the devices in my home, workout equipment, plus all my other physical stuff and realized I have enough to satisfy me for several years—let alone the solitary year ahead.

I’ve also looked at my digital space, with all its endless files, apps and tools I’ve flirted with (I must’ve tried every note app and word processor at this point) only to recognize the tools I constantly come back to are more than enough.

Then there’s the many quality activities and games to do with my kids that we’ve barely touched. Wouldn’t this be a great year to get around to them all?

Finally there’s the incredible relationships I have in my life. These have always been the source of my strength and the encouragement to venture out, reflect and grow. By recognizing I have enough in every other part of my life, I can take more time to appreciate and work on these solid pillars instead of taking them for granted.

Perhaps in taking in this year and focusing on that word, I might even come to realize that who I am is enough and I can stop being so hard on myself. Strive to do better? Always.

Dwell in a hopeless comparison to a standard that can never be achieved? Enough.

In fact, it starts now.

The Commitment to Knowledge

I wouldn’t be much of a teacher if I didn’t have a love of learning and, more specifically, sharing that with others. It’s a part of the brain that never turns off—always clamouring for more.

Here’s the problem:

Very little of that knowledge is rooted in experience. Most of it is nothing more than disparate pieces of information that make for good trivia, but are far less applicable.

Consider the time I had as a magician, where everything from stagecraft to showmanship could instantly be applied to my own set. This also built the foundation for the classroom, where knowledge of timing and audience management is paramount.

Hence, my long-time quest of acquiring knowledge for the sake of acquiring knowledge is being replaced with a curated look at how I can tack it onto a lived experience. Or, in the case of my students, how I can tack it onto theirs.

It’s one thing to be willfully ignorant, but it’s a whole other to know all the workings of a car engine and have no clue how to fix it.

Another Year Older

Another year where I’m a little less certain.

A little less wise.

A little less sleep.

A little less patience for B.S.

A little more forgetful.

A little more picky.

A little more willingness to admit when I’ve been wrong.

A little more appreciation for the relationships in my life.

A little more certainty I am where I’m supposed to be.

A little more love to share.

A little less life to live…

but

A lot more life in the days I have left.

In The Right Hands

I recently watched a movie on the suggestion of a trusted friend and walked away from it with a sense of malaise. It was a high concept that was quite interesting, but riddled with plot holes and good actors who felt like they weren’t trying.

It had its moments, but I couldn’t help but think it could’ve been a wonderful film in the hands of another director. Mind you, the choice of director would change the film completely, which is the tradeoff one must accept. Style and execution don’t always line up with vision.

However, in the right hands, something good could become something extraordinary.

The question we need to ask is whether our hands are the right ones for the task. It takes some humility to say they’re not, and even more humility to admit we have a lot more work to do before our hands are the right ones.