Class Warfare Blog

March 31, 2021

A Mask Revelation

Filed under: Technology — Steve Ruis @ 10:36 am

Claudia brought me some new masks she was trying out. I believe this is the eighth or ninth different commercial mask we have tried. The issues with the others were strain on one’s ears, fogging of eyeglasses, and so on: all of the usual complaints. This mask looked much like the other disposable masks we have tried. (We had also tried washable cotton masks.) I put one on to go get the mail and I immediately noticed a difference. When I inhaled, the mask collapsed against my face, and when I exhaled, it ballooned slightly away from my face. What that told me is that the air being moved was actually going through the mask rather than around it, constituting actual evidence that it was working. I hadn’t noticed this effect on any of the other soft masks; actually, just the opposite: more air flowed around those masks than through them.

None of the masks we have tried so far are of the fairly rigid kind. I assume they, too, have advantages and disadvantages, but I like the feedback these masks give me (I’m working, I’m working . . . like the Little Engine That Could Mask it is).

I suspect we still have months if not years of mask wearing in front of us, so I thought these insights might be helpful.

Black Disposable Face Masks

December 23, 2020

The Social Dilemma

Filed under: Art,Culture,Technology — Steve Ruis @ 10:24 am

The documentary of this title is currently available on Netflix and I had passed over it quite a few times before viewing it, which I did last night.

The documentary mixes in talking head segments with little scenes in an ongoing drama of how social media affects a family. I could have done without the vignettes as the talking heads were quite spectacular. These were all people who either had something to do with the development of social media companies or had studied the effects of their existence in some detail.

The basic premise is that social media platforms use algorithms to line their pockets . . . nothing wrong there, except that the algorithms have no mores; they just want to feed your attention more of what you are interested in. This results in a massive case of positive feedback for everyone who participates. Positive feedback is almost never a good thing.

The talking heads point out that we who participate are all being manipulated, without any real judgment being applied either by us or by the providers, and that this is dangerous.

Bless them as they say that there are no villains here. Nothing was done with intent to cause the problems that now exist. They found the inventor of the “like button” who explained what was behind its creation. An unintended consequence stems from the fact that we evolved in small social groups, in which it was important to be liked by a majority of one’s fellows. The social media platforms have extended that circle to thousands of strangers, often leading young participants into doing bizarre things to accumulate “likes” from them. And to what end?

An expert on AI systems says that we all worry about when artificial intelligences get so powerful that they overwhelm human strengths, like SkyNet in the Terminator movies (accompanied by the crunching sounds of human skulls beneath the feet and treads of robots . . .). But well before that point we would reach a point at which AIs could overwhelm human weaknesses, a point they did not claim we are at yet, but they easily could have.

They discuss the effect of social media upon political polarization, even upon whole nations’ stability and elections, and what might happen should an autocrat really use social media effectively.

I thought I knew the topic well, yet I found myself much better educated for having viewed this doc. If you have also viewed this documentary, what do you think?

December 22, 2020

At the Risk of Being Overbearing . . .

I offer a link to yet another aspect of the Pfizer vaccine roll-out kerfuffle. This post explains why the critic “IM Doc” was disappointed in the article in the New England Journal of Medicine, which exists to inform people, especially doctors, regarding what they need to know.

Whether this can be laid at the feet of the NEJM or Pfizer is almost irrelevant (almost, but not quite). It does, however, lead one to wonder how informed the opinions of our own doctors are.

A Document Maven Looks at the Pfizer Vaccine Paper in the New England Journal of Medicine

 

December 15, 2020

Important: Before You Line Up for the Pfizer Vaccine . . .

Filed under: Economics,Reason,Technology — Steve Ruis @ 8:25 am

I think it is imperative that you read this article. The hand waving going on in the news media, and even in scientific publications, is of the kind magicians use: to distract you from what the other hand is doing.

An Internal Medicine Doctor and His Peers Read the Pfizer Vaccine Study and See Red Flags [Updated]

December 5, 2020

An Error of Extrapolation

Filed under: Culture,History,Technology — Steve Ruis @ 12:29 pm

It was a simple error, made long ago, but I have kept it up all of these years. It started from the factoid that the life expectancy of human beings (of the American kind) in, say, the first decade of the 20th century was roughly 45 years. This was interesting to me because this was close to when my parents were born (1912 and 1919). By the end of the first decade of the 21st century, the life expectancy of American females was well over 80 years and that of American men almost 80 years, so one can conclude that, well, things just keep getting better.

The extrapolation that was in error was this: since there was a large increase in life expectancy from about 1910 to now, I assumed similar changes would be found going farther back before 1910. Going back to our prehistoric ancestors, then, their lives must have been nasty, brutish, and short, as claimed by Thomas Hobbes. But in so extrapolating, I made a major error, one of a statistical sort.

What do you think was the life expectancy of our hunter-gatherer modern human ancestors? If you say “fairly short” you will be somewhat right, but let me ask another question: at what age did those human relatives usually die (essentially of old age)? This is an interesting question and it has an answer. Our hunter-gatherer forebears lived well into their sixth or seventh decade, not much different from the age at which we die now. How can this be so?

This will involve a little math, but I used simple numbers to keep everything simple, and well… sheesh, relax, you don’t have to do the math, just read it. Okay, consider a population of 100 humans who all grow up and die at an average age of 60 (some a little younger, some a little older). This means their life expectancy, at birth, was 60 years. What would happen to that life expectancy, though, if 10% died at birth? It drops to 54 years, even though 90 live to die at about 60. And if the infant death rate were 20%, the life expectancy would drop to 48, even though 80 live to die at about 60.
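For anyone who wants to see the arithmetic spelled out, here is a minimal sketch of that averaging effect in Python, using the same made-up numbers as above (a hypothetical population, not real demographic data):

```python
# Life expectancy at birth is just a weighted average of ages at death.
# Illustrative numbers only: everyone who survives infancy dies at about 60.

def life_expectancy_at_birth(infant_death_rate, adult_age_at_death=60):
    """Average age at death when a fraction of newborns die at age 0
    and everyone else dies at roughly adult_age_at_death."""
    return infant_death_rate * 0 + (1 - infant_death_rate) * adult_age_at_death

print(life_expectancy_at_birth(0.10))  # 54.0 (10% infant mortality)
print(life_expectancy_at_birth(0.20))  # 48.0 (20% infant mortality)
```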

It is clear that the survival rate of infants was much lower in prehistoric days, and so their life expectancy, from birth, was dragged down. But if you survived for five years, better 10, you could expect to live into your 60s or 70s.

Okay, let me now go back to life expectancy in the early 1900s. It was about the same as it was for our prehistoric ancestors! So, roughly 5000 years of civilization brought what in terms of progress? I think what we got were broader bell curves. The rich did very well indeed, but the poor did very poorly indeed . . . again, the curse of averages. So, the big question is: what did civilization give us in the way of progress? For the vast majority of us, it was diddly squat.

And yet, we have this impression of the inexorable movement toward “greater progress” to come. Things will “keep” getting better! Right . . . !

When people are asked what they want from their jobs, they invariably put close to the top of the list “greater autonomy” in their work, that is, the ability to shape what it is that they do. Some degree of control is desired, instead of being told by a supervisor what to do and when to do it. So, what did hunter-gatherers have? Almost complete autonomy. Plus they lived, and still do in remote places, in quite egalitarian societies, and do “work” for only a small part of their days. All this was sacrificed when people were forced into becoming agricultural workers. Plus the poorer diets and close proximity to other people and domestic animals left human beings shorter, lighter in weight, more disease ridden (dental problems included), and with shorter life spans.

Yet we continue in our delusion that being civilized is “better,” even morally so. (“What a piece of work is man …” Shut up, Wil!)

More on this later.

Addendum: My mother lived to be 86 and my father to 80. Your life expectancy goes up the older you get! There are estimators available on the Internet.

August 2, 2020

There’s Wrong and Then There is Wronger (and Wrongest?)

Filed under: Culture,Technology — Steve Ruis @ 7:52 am

I was watching the Cubs baseball game last night and the announcers said the temperature at the start of the game was “room temperature,” right at 72° F. An inning or so later one of them wondered why 72° F ended up as the standard room temperature (here in the U.S.). So, during a commercial message one of them went online and found an answer. He stated that 72° F was standard room temperature because that temperature was derived “from internal body temperatures being 98.6 degrees ±, thus making the temperature of skin to be around 72-76 degrees Fahrenheit.” So, room temperature was skin temperature, apparently. The two announcers lauded having so much wonderful information at their fingertips and how much better it was to know than to sit in ignorance.

I am sitting there thinking WTF?!

There is a clinical term for when one’s skin temperature is equal to room temperature, what was it now . . . oh, yeah, DEAD! My memory came up with a number of 92° F for skin temperature and a very quick search came up with a better number, a range actually: roughly 92° F to 98+° F. (Your fingers are exposed to the environment more so than, say, your armpit and so are colder. Your armpit is close to the conditions existing inside your body so the skin temp there is close to your internal body temp.)

So, the answer the baseball announcers came up with, how could it have been so wrong? Well, it was a quote from the Quora web site. Quora is a question and answer website on which people ask questions (sincere and not so) and other people supply answers (also sincere and not so). Whether the answers are right or wrong or somewhere in between isn’t curated.

So, the answerer on Quora either was blowing smoke or was told something that sounded right by someone else, or . . . whatever, and then the announcers shared this incorrect information with a couple of million people.

This is certainly indicative of our current culture.

We are at least past the “it has to be right otherwise they wouldn’t let them put it on the Internet” stage, but not very far past. As was the case before the Internet, you have to know a lot to be able to find correct information. My favorite example from those pre-Internet days was looking up how to spell a word in a dictionary. So, to begin, what do you need in order to find the listing for that word in the dictionary? The spelling, of course!

So, if you are looking for something on the Internet, you need to look at more than just the top listing provided by a search engine. If you have no way to verify whether what you looked up is reliable, you need to refer to several such items to see if you can find a consensus. You need to consider the sources of those bits of information. (This was one of the errors the broadcasters made; one of them needed to know that Quora answers are not necessarily dependable.)

And then when you mention what you have found, you begin the statement with “According to <reference> . . . blah, blah, blah.” This does not include using “According to the Internet . . .” as the Internet has no opinions or knowledge of its own, only what has been posted there by others. (I know the Internet. The Internet and I are friends and, trust me, you’re no Internet.—If you recognize the quote from which this was crafted, you are older than you look.)

And this is how any number of conspiracy theories and bogus movements get started. I honestly do not believe that there is a Flat Earth Society, or whatever they are called now, that is full of committed believers. I am more likely to believe it is full of iconoclasts and people who like attention over approval (almost always males, btw). But some of these other people are not healthy psychologically and it is not good for them or society in general to be so provoked.

If you are wondering why “72° F (ca. 22° C) ended up standard room temperature here in the U.S.” you can look it up and the real answer makes a great deal of sense.

June 23, 2020

Typography Evolves, Not Necessarily for the Better

Filed under: language,Technology — Steve Ruis @ 10:54 am

I am a bit of a typography snob. I work as an editor and I work with people in their teens and in their nineties. I note that quite old people tend to show some quirks of their past. For example, at one time English, as German still does, capitalized most nouns. We have moved away from that practice, but some older writers overcapitalize. It was also once the practice to have a space before colons and periods, which is no longer the case; so, as mentioned, things change.

There is also a slow morphing of compound nouns. In the 1930’s it was quite common to see to-day and to-morrow in print and now the hyphens are gone. This is a common process. A place in one’s home to have a fire becomes a fire-place and then a fireplace. The same thing happened to sail-boat, foot-path, black-face, skin-head, and dog-house.

Currently we are seeing another transition, one I hope does not stick. This is the recent practice of only capitalizing the first letter of an acronym, an abbreviation formed from the initial letters of other words and pronounced as a word, for example NASA. Back in my early days these things were typed out thus: N.A.S.A., F.B.I., and C.D.C. After a while we dropped the periods as being superfluous and so we got: NASA, FBI, CDC, CIA, SCOTUS, etc. This was acceptable because there were very few other situations in which words were formed from all capital letters. No one would be confused seeing NASA instead of N.A.S.A. But now I am seeing Nasa more often than not.

If the “all capitals” rule for acronyms is taken away, as is becoming the current practice, the possibility of confusion increases a great deal, especially for young or new readers of English. I tend to approve of such changes when they either (a) simplify communication or (b) make communication more accurate. In this case I don’t see what is saved. If I type <caps lock> n, a, s, a </caps lock> instead of <shift> n, a, s, a, I am not really saving a lot of effort.

I went to Wikipedia to consult a list of acronyms (and their ilk, such as initialisms) and I limited myself to just those starting with A and C.

Some of these, such as CAP, which stands for Civil Air Patrol, would easily be misunderstood if written as Cap, possibly referring to a piece of headgear, especially if the word begins a sentence, which always begins with a capital letter anyway. Others of this kind are:
FOE  Friends Of The Earth
ACE  Allied Command Europe
ADAGE  Air Defense Air to Ground Engagement (simulation)
AID  U.S. Agency for International Development
AM  Amplitude Modulation
CARP  Computed Air Release Point
CART  Championship Auto Racing Teams
CATS  Computer Active Technology Suspension
CIAO  Critical Infrastructure Assurance Office
CIS  Commonwealth of Independent States
COBRA  Consolidated Omnibus Budget Reconciliation Act of 1985
COIN  Counter-Insurgency (military)
COPE  U.K. Committee On Publication Ethics
CORE  Congress of Racial Equality
CREEP  Committee for the Re-Election of the President (Nixon)
Plus there are any number of these which could appear to be a person’s name, the first letter of which is typically capitalized.
TERI  Tata Energy Research Institute
ANA  All Nippon Airways
COLT  Combat Observation and Lasing Team (military)
CHiP  California Highway Patrol

Since these came from lists with just these two letters of the alphabet, I am sure there are hundreds of other terms that could also be sources of confusion.

I do not intend to adopt this new practice and hope that it dies out over time as being counterproductive.

How do such things get started? I do not know, but my guess is in magazines. Magazines are always looking for typographical ways to appear trendy, on the forefront of the topics they cover. Magazines are responsible for article and book titles now being formatted as if they were sentences (few are), which I believe emanated from ad copy. A header in an ad that looks like a sentence with no “full stop” at the end encourages people to keep reading to find closure for the idea it began to state.

April 13, 2020

Election Security, Election Trustworthiness

Filed under: Politics,Technology — Steve Ruis @ 11:01 am

I was watching a documentary called Kill Chain last night. It was about how easy it is (not would be) to hack into the electronic systems used for our elections, a fact that by itself undermines the integrity of our election process. At one point the people leading the documentary took a series of machines to a hacker conference in Las Vegas and asked people there to try to hack the machines. So, with little to no preparation and only the tools that they had on them, the play began. In just a couple of days, every machine available (including all of the ones currently in use) was hacked. These were casual hackers working part-time while attending a conference. As the hosts commented, in Russia and other countries there are highly trained and motivated professionals working 24-7 to do the same. How hard could it be?

Given that the technology was 16 years old, with four years being a long generation for computer hardware, this was hardly a surprising outcome. The documentary went on to document several rather egregious examples of hacked elections, so why hasn’t there been federal action to forestall our elections being undermined?

The documentary showed a clip of Mitch McConnell, the majority leader of the U.S. Senate, saying that any such anti-tampering legislation would have to be bipartisan to be brought up for a vote in the senate. Then various senators pointed out that at least four bipartisan anti-tampering bills had been forwarded to Senate leadership and none had been brought to the floor. Each had been killed by . . . wait for it . . . wait for it . . . Mitch McConnell.

Even though McConnell seems to be in the pockets of the Chinese and/or Russians, it is quite extraordinary to accuse a sitting Majority Leader of such a treasonous act, so the politics are more likely local.

So, think about this. Think about the current GOP membership and the current Democratic Party membership. On one hand you have CEOs who can barely type, bankers and the like, farmers, and soldiers; on the other you have all of the New Age hippie computer nerds in tie-dyed T-shirts. Which party do you think would have the better hackers? Yeah, it was obvious to me, too.

So, why is Mitch McConnell acting to protect Democrat election hackers?

Why would he betray his own party like that? In any kind of reasonable contest, the hippie Democrats could hack the shit out of a band of GOP members, so why is Moscow Mitch protecting Democrat hackers? What do they have on him to make him their puppet? Do you think they are controlling the outcome of his re-election? What would make a staunch rock-ribbed Republican into such a toady for a bunch of hippie hackers?

PS Watch this documentary!

 

 

March 18, 2020

Can Software Be Conscious?

Filed under: Morality,Technology — Steve Ruis @ 11:55 am

This was a question addressed at a conclave of philosophers, software developers, and their ilk, but I think it is the wrong question. I think the question is whether a “computer” can be conscious, with the word “computer” standing in for some combination of hardware and software.

A major leg up on being “conscious” is being self aware. Researchers devised the mirror test to see if animals recognized their reflections as themselves or as another animal or at all. I don’t know if this test really tests for that and I do not know whether any other animal has “passed” this test, but it does show the importance being placed upon being self aware by consciousness researchers.

Now, let me begin my argument with a thought experiment. In my sport we depend a great deal on proprioception, which is the awareness of where our body parts are in space. For example, you can pick up a glass of water, close your eyes, and take a drink from that glass with no problem. We “know” where our hand is and we can feel the glass in it, and we “know” where our mouth is, so we don’t need to be “talked down” (a la every airplane crisis movie ever made) to deliver that load where it is intended.

This ability is not obvious to us, but any disruption of it results in quite some confusion. For example, if you get an injection of a painkiller in your gums for some dental work and part of your tongue goes numb, don’t expect to be able to talk and be understood until that anesthetic wears off. The position of our tongue in our mouth is necessary information for being able to form sounds.

Another sense we suffer from losing is our sense of balance. If you have ever had extreme vertigo, you will know what I mean. But if we have a stuffed-up nose from a cold or whatever, we seem to get by quite adequately without a sense of smell.

Now, as to the thought experiment. You are lying on a bed and you lose your senses, one by one. First, you lose your sense of sight, which means full “fade to black,” not just what you can still see when you close your eyes. Then your sense of smell, then hearing, then touch, then taste, finally your sense of balance, then proprioception. You cannot feel the bed under you or the breeze blowing in the window or hear the birds chirping outside.

This is why the science fiction trope of having a brain in a jar doesn’t work. How long do you think you could remain sane in this state? You couldn’t even scream for help. It is questionable whether you would even be able to vocalize.

Software does not have “sensory input” without hardware. And it seems that we are rapidly developing sensory inputs for computers. A common theme of news commentaries is face recognition software, which is, of course, dependent upon video feeds as “sensory input.” An article headline in this week’s Science News is “An AI that mimics how mammals smell recognizes scents better than other AI.” AI stands for artificial intelligence or “super-duper computer.” Computers, for quite some time, have had the ability to monitor the temperature of their CPUs and can tell you if they are experiencing a “fever.”
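To make that last “sense” concrete, here is a minimal sketch of a computer taking its own temperature. It assumes a machine on which the third-party psutil package can read temperature sensors (it cannot on every platform), and the 85° C threshold is just an arbitrary number chosen for illustration:

```python
# A computer checking itself for a "fever" via its temperature sensors.
# Assumes the third-party psutil package on a platform (e.g. Linux) where
# sensors_temperatures() is available; it may be absent or empty elsewhere.
import psutil

def cpu_has_fever(threshold_c=85.0):
    """Return True if any reported temperature sensor exceeds the threshold."""
    temps = psutil.sensors_temperatures()  # dict: sensor name -> list of readings
    for entries in temps.values():
        for entry in entries:
            if entry.current > threshold_c:
                return True
    return False

if __name__ == "__main__":
    print("Fever!" if cpu_has_fever() else "Temperature seems normal.")
```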

It is not a big stretch of the imagination to think that if we continue to add “senses” to computers and allow those computers to monitor their sensory inputs, there is a much greater likelihood that one of those AIs will become self aware.

Now, I am sure that some people will argue that these computers would only be simulating self awareness or some other such construct, but since I do not see that we fully understand our own self awareness, I do not see how we could build machines whose self awareness is exactly the same as ours. Nor do I see that that is a necessary condition. Self awareness is self awareness, no matter the mechanism.

I have read science fiction and fantasy for at least 60 years and have read more than a few stories about self aware “computers” and what they are capable of, including feeling something akin to death when they are “turned off.” In the latest season of the show “Altered Carbon” on Netflix, the main protagonist’s AI does not want to perform a reboot, even though he is glitching up a storm, because he doesn’t want to lose memories which are precious to him. Apparently recorded memories are just not the same as “real” ones. A major step along this path, a path that leads to self aware “computers,” “AIs,” and whatnot, is providing what can stand in for senses and internal monitoring of those senses. We seem to be barreling down this path at great speed, so I think this may happen in the next 50 years, if not sooner.

 

 

March 16, 2020

We Are Done For

I was refreshing my memory of some of the earliest episodes of Westworld (on HBO) before I tackled the beginning of the third season. In the first episode Mr. Ford (played by Anthony Hopkins . . . I love Anthony Hopkins and have since The Lion in Winter), musing on our unknown scientific and technological future, mentions that one day we may even conquer death, and as he finishes that thought, he follows with “And then we’d be done, finished. We’ll never get any better.” (This quote is from memory so it is probably slightly “off.”) This is one of the more brilliant throwaway lines in this series.

What Mr. Ford is alluding to is that death is an absolutely vital part of the natural selection process at the heart of the theory of evolution. If we conquer death, then mutations and other changes make no difference in our survivability, and that marks the end of physical evolution.

Of course, social evolution is still possible . . . but then so is social devolution. We could change socially in ways that benefit no one and, in the short term, we could eradicate our own species before corrective actions prevailed.

The current Coronavirus-induced pandemic is a quite vivid sign of how ill-prepared we are for such global events. With Trump in the White House and some Hindus drinking cow urine as a palliative for the Coronavirus, we have a full spectrum of idiocy guiding our actions, it seems. Well, that and the normal “panic and run around like a chicken with its head cut off” human behavior (that is not a baseless metaphor; I have observed that behavior). “Toilet paper, toilet paper, my kingdom for toilet paper!” Where is Shakespeare now that we need him?

