Who knows?

In light of the massive fraud case being pursued by the federal government against a company that runs for-profit colleges in culinary, visual art, and other programs, one wonders where people can attend higher education anymore and learn anything practical.  In his book, “Shop Class as Soulcraft: An Inquiry into the Value of Work”, Matthew Crawford elegantly examines this problem, and how a culture that loses the ability to appreciate work of the hands is in trouble.

Most liberal arts institutions and major state universities still follow what we might think of as a books-and-desk curriculum, or more aptly a books-desk-computer curriculum.  Lab spaces are reserved for chemistry, biology, a few engineering, some medical, and studio art courses.  Very little else is “hands-on”.  That work is usually considered more base, less theoretical, and relegated to what we used to think of as community colleges.  But even community colleges started dumping hands-on courses in favor of a “college prep” curriculum, becoming less identified with “training”.  High schools did the same, tossing out shop classes in favor of computer lab space.

But as I searched for an expert plumber, electrician, heating and cooling specialist, bricklayer, and various other “hands-on” professionals in the past several years, one thing became clear.  The people who spent their lives developing exquisite expertise in essential areas are growing old, and there is no one to replace them.  Every professional I spoke to agreed, and lamented both the lack of preparation programs in high school and community college and the cultural perception that these careers are somehow less valuable, less honorable, less important than, say, accountant, banker, or business manager (all of which these people learned to be, or at least learned how to hire for), or other higher-status fields.  One went on to say that he sees so many people going to the gym to work out, but if they had jobs working with their hands and getting off their behinds every day, they would not need to.  It was an interesting point of view.

I recall my father talking about all the “college-educated” engineers who could not see when their CAD drawings, like an Escher print, would not work.  He would often take wire and bend it into a 3-D representation to show them the error of their ways.  They were never very happy about it, but over time they came to depend on his visual-spatial and hand skills, and offered a grudging respect.

As I get older, I wonder: who will know?  Who will know how to do anything?  Are we expecting just to do searches on our computers, run down to the Lo-HumDepo-Ace’s, get whatever standardized Chinese-made supplies they have, and do everything ourselves?  There are a lot of issues one must be an expert in to truly understand, to strategize, to plan, to solve.  Just reading a how-to page will never be enough.  Husband did a fine job on 80% of the house.  But that final 20% needed an expert, and he relied on experts to check the work he had done as well.

All my sewing machines are still mechanical, not digital.  It is a cost-benefit analysis for me: when something breaks or goes haywire on a computerized machine, it is hideously expensive to fix and often requires buying a new machine.  They are also plastic through and through, and the wear and tear on them adds up quickly.  My mechanical machines (mostly metal) still require maintenance (which I am learning to do), and sometimes specialized care.  Then they are usually good to go for a long, long time.  But the people who know how to work on these machines are also aging, dying, and passing into memory.  They take entire worlds of knowledge with them, knowledge we may not get back.

There is also the aesthetic appeal of machines, of working with one’s hands and seeing the results.  It not only builds confidence and practical skills, but also develops complex 3-D thinking in ways nothing else can.  I think Crawford is right: until we begin to value work of the hands again, we will continue losing not only a part of our culture, but an important part of our minds.  As discussions about rewriting school curricula continue, let us not disregard the shop classes, the labs, and the field work that help us become what we are: three-dimensional beings made of muscle and fine motor skills.  Let all forms of learning and technology be respected as helping us become the best of who we can be.  Let us plan sensibly for all the jobs and roles we need to run our wires, lay our pipes, and build our communities.  In sum, I hope we are not left standing around asking, “Who knew?”

Que?

I was listening to Kate Bush today and thumbing through my approximately 30 gigabytes of music (and for the record, I wish I had more).  I remember when iTunes first came out with the data analyzer that attempts to make recommendations based on your library.  It was doomed to fail in so many ways (for one, I own enough items that iTunes does not carry that my collection would be hard to analyze adequately; too much of the data would have to be ignored as unknown), but the errors are an interesting problem.  For much of my life I have felt dismissed as an outlier.  In statistics, that is a convenient way to discard data by saying, in essence, “it does not fit anything we can use and is so far out of our margins of experience we can just throw it out”.
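For the curious, the convention I am griping about can be sketched in a few lines.  This is only a toy illustration (a simple standard-deviation cutoff, which is just one of many rules analysts use; the data and the 2-sigma threshold are invented for the example):

```python
# Toy outlier rejection: anything more than `cutoff` standard deviations
# from the mean gets thrown out as "noise".
def discard_outliers(data, cutoff=2.0):
    mean = sum(data) / len(data)
    variance = sum((x - mean) ** 2 for x in data) / len(data)
    std = variance ** 0.5
    return [x for x in data if abs(x - mean) <= cutoff * std]

# Five "typical" listeners and one eclectic one, in minutes per day.
listening_minutes = [30, 32, 28, 31, 29, 300]
print(discard_outliers(listening_minutes))  # the 300 is tossed
```

Notice that the eclectic case is not examined, just deleted, which is exactly the complaint.
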

In data mining, eclectic cases could be interesting, but because they do not neatly provide predictive lines of analysis, they are more often than not dismissed as noise in the data.  Think of smoke coming out of your toaster (which is how I thought of iTunes when it tried to send me recommendations: it was frying itself a little trying to make matches.  And yes, this is a vague reference to flying toasters.  And no, not from BSG).  “Noise” or errors in the data can also be used to refine pre-existing category systems (think of when Photoshop deletes an object and then fills in the gap with inferential data from the surroundings).  The more of a concept’s possible realizations that are explored in real experience, the more complexity is added to how that idea is understood, manifested, and reflected back.

On Facebook, I tried to limit the amount of information that could be freely collected by the site and its “partners”.  One of the things I did was put in a wildly inaccurate birth date.  I like messing with data mining.  I purposely have a small friends group too, so anyone who knows me well already knows my birthday and knows my penchant for messing with freeloading data systems.

When we know ourselves, we know the parts that are ridiculously stereotypical and the parts that are eclectic.  The mash-up and resulting paradoxes are, I think, just part of being human.  Go figure that one out, cognitive science.  OK, I know people are trying, but really: so many current models are inadequate, or just rehash problems philosophers and early psychologists articulated better.

There is something compelling about the notions of innovation, eclecticism, and creativity.  I formally studied these ideas long and hard for many years, for altruistic reasons as well as knowing I was a thorny case for the subject.  I found satisfying models and metaphors in work from people like R. Sternberg (the concept of Practical Intelligence), from the bulk of expertise research, from the personal musings of many artists, musicians, and writers, from aesthetics, and from many other sources.  At the same time, there is an element of the wisp of smoke coming from a toaster about the idea of creativity; to try to understand open-ended, eclectic thinking is sometimes to burn out the very tools of analysis you bring to the subject.

On a practical level, I have enjoyed the research showing quite definitively that if we do not use our brains in challenging and novel ways, they will atrophy and contribute to dementia.  As often as I feel like a complete alien in every culture and subculture I have been in, it is comforting to know my “sideways” way of viewing things may be helping me age better.  But does that also mean we are doomed to be forever incomplete, constantly learning beings if we are to survive, and survive well, thereby possibly both limiting our usefulness (incomplete, exploratory) and making us adaptable?  Ah, to be or not to be (apologies to WS).

We romanticize creativity, but in practice most folks are terrified of things that are different, and skeptical of the new.  I am not an early adopter of technology (for one, I can’t afford to be), but I am deeply interested in what objects and processes new technology is applied to.  I have been called blissfully naive in my life, and I took it as a put-down.  But looking back, it was a habit of not making rash judgments, of extending the time to understand someone or something (and yes, therefore putting myself in harm’s way from time to time), that garnered me that label, combined with a seemingly insatiable thirst for new experiences and stimuli.  I also know that withholding indexing or categorizing, being flexible in how one views an experience, is a core component of creativity.  It is the open-ended question without any one absolutely right answer that fascinates.  Wisdom for folks like me is learning which situations require immediate categorization and which can be allowed the extra time and thoughtfulness necessary for satisfactory input.

I don’t think data mining systems have reached the wisdom stage yet, and so they will continue to emit wisps of smoke when confronted with eclectic cases, or toss out so much of what I am as outlier that what is left is a pale imitation, a grossly inadequate summary.  I think data mining (and life in general) does this to an extent to all of us, regardless of how predictable we may seem.  Some of the resulting assumptions may be useful and help improve the tools that operate in our lives, both foregrounded and backgrounded.  Others lead to horrible policy, and to situations like those at airports when checking in for a flight.  In essence, everyone and no one is a terrorist: a complete paradox of institutionalized applications resulting from awful data analysis systems.

Simply using “that” word in this blog selects me out for further analysis in the gross internet search-and-filter systems of some intelligence programs.  Try to mash up all the topics I have crossed here and see if I am a no-fly: wisps of smoke, then throwing out most of what is written and determining that I am not a threat due to inconsequential outliers.  See?  It is useful, right?

If only rejection were always so useful.

Husband recently said with a laugh, “I know WHO you are (and he trusts what he knows), but I don’t know WHAT you are.”  He struggled to explain (and remember, this man is an accomplished artist) that he meant he does not know where I fit in.  Funny, I don’t either most of the time.  But I’ll keep taking those wisps of smoke and the subsequent collapse into absurdity and laughter as the only way, other than despair and insanity, to synthesize the issue.  In layman’s terms, to come to a temporary peace.  But as many of us know, it is a peace that will soon be disturbed by the enjoyment of the knotty problem, the hurtful surprise, or the macro-level existential paradox that never ceases to exist in the back of every conscious mind.

So good night, all you toasters, running your programs.  Maybe someday that AI will learn how to integrate the errors that lead to meltdown or burnt toast; or maybe there will just not seem to be a good reason to mirror us, and you’ll leave us alone.

A weakness for cheese and a fondness for robots

I joined Facebook recently.  What a strange phenomenon it is.  I appreciate getting to share photos with relatives and friends who live at a distance.  I even appreciate people I knew finding me (and being found), and being able to post links and other ephemera.  But I am still trying to understand this twenty-four-hour news cycle, Twitter-fomatic orientation to the world that seems the domain of a generation younger than I.

Someone passed on the “tell me twenty-five things about yourself” irritant that is popular right now.  I cringed.  I did not like the trend when it started, and am not much of a joiner that way.  My first response was flip: “Fingers, toes, eyes, ears, and nose.”  When I considered it fully, I finally just said “oh, read the blog, you lazy thing” and forgot about it.  Then I came up with my own version of the challenge: if you took a snapshot of your life right now, what would you title it?  I don’t particularly mean the micro level of the actual minute, but a more general sense of reality.  I decided mine would be “A weakness for cheese and a fondness for robots”.

Another friend forwarded a photo essay about egregious acts of culinary evil that “make you fat” from restaurants around the US.  Well, they are not responsible for my particular issues, I thought, but becoming more sedentary and having a weakness for cheese certainly are.  When we were courting, Husband made a joke about my fondness for fermented milk by sending me the biggest hunk of cheddar (from Wisconsin) I had ever seen.  The man understood me.  The title also reminds me in some vague way of all things Monty Python, with some of Aardman’s Wallace and Gromit series thrown in (and I do like a bit of gorgonzola).  Cheese somehow sums up both the domesticity and absurdity I find myself in right now, and my fondness for it is both my downfall and my pleasure.

The robots reflect my Husband’s and sons’ fascinations.  This birthday the boys are begging for a robot cake, which I have figured out how to make thanks to many examples from Google Images (no, not a cake that walks around, although that would thrill them, as would the subsequent active narrative of destroying it).  Robots are everywhere in our home, and I have yet to completely understand the devotion to them.  Which brings up my latest food-related idea.  Normally, if I had caught eldest son drawing on paper with one of my exquisite small bars of Madagascar chocolate that he obviously found on the fridge door behind the butter (OK, I should have hidden it better), I would have been angry.  But it was cool in the house, and like a crayon the bar glided over the paper, making an interesting brown robot.  Most chocolates have a high wax content, so this came as no surprise really, but my special bar had a low wax content, so in any warmer weather it would not have had the same effect.  I thought about an Iron Chef episode in which a young pastry chef amazed the audience with his mastery of sugar, food coloring, and heat.  I wondered if such a chef could make a sort of edible rice-paper parchment and a chocolate crayon to draw upon it.  Little cartoons could be quickly sketched and stuck at odd angles into small mounds of homemade ice cream.  Finally the chef could respond to his devotees like a political cartoonist in a dark bar or cafe.  I have never heard of such a thing, so maybe I have actually come up with something new for once!  You “heard” it here first, folks.

When I joined FB and logged in as “trying to figure this out”, one witty person told me to ask a neighborhood kid.  If it were pure mechanics, I might have.  But the statement was addressing the larger issue of cultural context and the varieties of meaning behind all the applications.  I still have not mastered Facebook, nor any of her sister circles of internet hell.  But I am learning, and to my surprise, am glad.  Maybe I am moving beyond the cheese.  What would robots eat?  What does this mass of circuitry consume other than time?  I wondered out loud.  My son’s reply: “Metal, Mom.  Just like Iron Giant.”  Well, I have been a little short on iron lately.  Pass me the supplements. . . or perhaps not.  Machine-age steampunk aside, that time has passed.  Wetware and silicon, electricity and bio-projects swirl in my head.  High-energy-consumption gray matter, and we are back to a need for glucose doping.  That greediest of organs, our brains.  So maybe I am back to cheese after all, in moderation, with some crackers.

Shaped like robots, of course.  Such conceptual fractals life seems to be, and I am out of bandwidth to follow the pattern for now.

“See you” on Facebook.