
Meta Blog


My recent Amazon-exit piece got an order of magnitude more traffic than even the most popular outings here normally do. Which turned my mind to thoughts of blogging in 2020, the why and how of the thing. Here they are, along with hit-counts and referer data from last week. Probably skip this unless you’re interested in social-media dynamics and/or publishing technology.

Numbers

In the first week after publication, Bye, Amazon was read somewhat more than 614,669 times by a human (notes on the uncertainty below).

The referers with more than 1% of that total were Twitter (18.13%), Hacker News (17.09%), Facebook (10.55%), Google (4.29%), Vice (3.40%), Reddit (1.66%), and CNBC (1.09%). I think Vice was helped by getting there first. I honestly have no idea how the piece got mainlined so fast into the media’s arteries.

For comparison, my first Twitter post is up to 1.015M impressions as of now and, last time I checked, Emily’s was quite a ways ahead.

It’s hard for me to know exactly how many people actually read any one ongoing piece because I publish a full-content RSS/Atom feed. Last time I checked, I estimated 20K or so subscribers, but nobody knows how many actually read any given piece. If I cared, I’d put in some kind of a tracker. That 614K number above comes from a script that reads the log and counts the number of fetches executed by a little JavaScript fragment included in each page. Not a perfect measure of human-with-a-browser visits but not terrible.
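A minimal sketch of such a counting script might look like this (the log path, the combined log format, and the beacon URL the JavaScript fragment fetches are assumptions for illustration, not the real ongoing setup):

```python
# Hypothetical sketch: approximate page views by counting, in a combined-format
# access log, the beacon requests that each page's JavaScript fragment issues.
# The log path, log format, and beacon URL are assumptions, not the real setup.
import re
from collections import Counter

LOG_PATH = "/var/log/httpd/access.log"   # assumed log location
BEACON = "/ongoing/beacon"               # assumed URL the JS fragment fetches

def count_views(log_path: str = LOG_PATH) -> Counter:
    """Count beacon fetches per referring page (one fetch ~ one human view)."""
    views = Counter()
    request_re = re.compile(r'"GET (\S+) HTTP/[\d.]+" 200 ')
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            m = request_re.search(line)
            if not (m and m.group(1).startswith(BEACON)):
                continue
            parts = line.split('"')
            # In combined log format the Referer is the second quoted field
            # after the request line; it names the page that ran the fragment.
            referer = parts[3] if len(parts) > 3 else "-"
            views[referer] += 1
    return views

if __name__ == "__main__":
    for page, n in count_views().most_common(10):
        print(f"{n:8d}  {page}")
```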

But aren’t blogs dead?

Um, nope. For every discipline-with-depth that I care about (software/Internet, politics, energy economics, physics), if you want to find out what’s happening and you want to find out from first-person practitioners, you end up reading a blog.

They’re pretty hard to monetize, which means that the people who write them usually aren’t primarily bloggers, they’re primarily professional economists or physicists or oil analysts or Internet geeks. Since most of us don’t even try to monetize ’em, they’re pretty ad-free and thus a snappy reading experience.

Dense information from real experts, delivered fast. Why would you want any other kind?

Static FTW

ongoing ran slow, but anyone who was willing to wait fifteen seconds or so got to read that blog. One reason is that the site is “static”, which is to say all the payload is in a bunch of ordinary files in ordinary directories on a Linux box, so the web server just has to read ’em off the disk (where by “disk” I mean in-memory filesystem cache when things are running hot) and push ’em out over the wire. (The bits and pieces are described and linked to from the Colophon.)

It turns out that at the very hottest moments, the Linux box never got much above 10% CPU, but the 100 Mbit/s virtual NIC was saturated.
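A back-of-envelope calculation shows why the box ends up bandwidth-bound rather than CPU-bound (the average transfer size per view here is an assumed figure for illustration, not a measurement):

```python
# Back-of-envelope: page views per second at a saturated 100 Mbit/s link.
# The bytes transferred per view is an assumption, not a measured figure.
link_bits_per_sec = 100e6                # 100 Mbit/s virtual NIC
avg_transfer_bytes = 100 * 1024          # assumed HTML + assets per page view
views_per_sec = link_bits_per_sec / (avg_transfer_bytes * 8)
print(f"~{views_per_sec:.0f} page views/sec at link saturation")  # ~122
```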

I’ve never regretted writing my own blogging system from the ground up. I’m pretty sure I’ve learned things about the patterns of traffic and attention on the Internet that I couldn’t have learned any other way.

If I were going to rewrite this, since everything’s static, I’d just run it out of an S3 bucket and move the publishing script into a Lambda function. It’d be absurdly cheap and it’d laugh at blog storms like last week’s.
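For what it’s worth, a minimal sketch of that publishing Lambda could look like the following (the bucket name, key layout, and render helper are assumptions for illustration, not an actual design):

```python
# Hypothetical sketch of a static-blog publisher running as a Lambda function.
# The bucket name, key layout, and render step are assumptions, not real code.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-static-blog"            # assumed bucket name

def handler(event, context):
    """Render one entry to HTML and publish it as a static object in S3."""
    slug = event["slug"]                  # assumed event shape, e.g. "2020/bye"
    html = render_entry(event["source"])  # assumed helper: source markup -> HTML
    s3.put_object(
        Bucket=BUCKET,
        Key=f"{slug}.html",
        Body=html.encode("utf-8"),
        ContentType="text/html; charset=utf-8",
    )
    return {"published": slug}

def render_entry(source: str) -> str:
    # Placeholder renderer; a real system would run its own templating here.
    return f"<html><body>{source}</body></html>"
```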

It’s not a top priority.

cjmcnamara (13 days ago): blogs! they're good

Social change and linguistic change: the language of Covid-19


It is a rare experience for lexicographers to observe an exponential rise in usage of a single word in a very short period of time, and for that word to come overwhelmingly to dominate global discourse, even to the exclusion of most other topics. Covid-19, a shortening of coronavirus disease 2019, and its various manifestations have done just that. As the spread of the disease has altered the lives of billions of people, it has correspondingly ushered in a new vocabulary to the general populace encompassing specialist terms from the fields of epidemiology and medicine, new acronyms, and words to express the societal imperatives of imposed isolation and distancing. It is a consistent theme of lexicography that great social change brings great linguistic change, and that has never been truer than in this current global crisis.

The OED is updating its coverage to take account of these developments, and as something of a departure, this update comes outside of our usual quarterly publication cycle. But these are extraordinary times, and OED lexicographers, who like many others are all working from home (WFH, first attested as a noun in 1995 and as a verb in 2001), are tracking the development of the language of the pandemic and offering linguistic and historical context for the usage of these terms.

Some of the terms with which we have become so familiar over the past few weeks through the news, social media, and government briefings and edicts have been around for years (many date from the nineteenth century), but they have achieved new and much wider usage to describe the situation in which we currently find ourselves. Self-isolation (recorded from 1834) and self-isolating (1841), now used to describe self-imposed isolation to prevent catching or transmitting an infectious disease, were in the 1800s more often applied to countries which chose to detach themselves politically and economically from the rest of the world.

As well as these nineteenth century terms put to modern use, more recent epidemics and especially the current crisis have seen the appearance of genuinely new words, phrases, combinations, and abbreviations which were not necessarily coined for the coronavirus epidemic, but have seen far wider usage since it began. Infodemic (a portmanteau word from information and epidemic) is the outpouring of often unsubstantiated media and online information relating to a crisis. The term was coined in 2003 for the SARS epidemic, but has also been used to describe the current proliferation of news around coronavirus. The phrase shelter-in-place, a protocol instructing people to find a place of safety in the location they are occupying until the all clear is sounded, was devised as an instruction for the public in 1976 in the event of a nuclear or terrorist attack, but has now been adapted as advice to people to stay indoors to protect themselves and others from coronavirus. Social distancing, first used in 1957, was originally an attitude rather than a physical term, referring to an aloofness or deliberate attempt to distance oneself from others socially—now we all understand it as keeping a physical distance between ourselves and others to avoid infection.  And an elbow bump, along with a hand slap and high five, was in its earliest manifestation (1981) a way of conveying celebratory pleasure to a teammate, rather than a means of avoiding hand-touching when greeting  a friend, colleague, or stranger.

 New and previously unfamiliar abbreviations have also taken their place in our everyday vocabulary, and these too appear in the latest OED release. While WFH (working from home) dates from 1995 as mentioned previously, the abbreviation was known to very few before it became a way of life for so many of us. PPE is now almost universally recognized as personal protective (or protection) equipment—an abbreviation dating from 1977 but formerly probably restricted to healthcare and emergency professionals. The full phrase – personal protective equipment – dates from as far back as 1934.

As a historical dictionary, the OED is already full of words that show us how our forebears grappled linguistically with the epidemics they witnessed and experienced. The earliest of these appeared in the late fourteenth and fifteenth centuries, when the great plague of 1347-50 and its follow-ups, which killed an estimated 40-60 per cent of the population of Europe, must surely have been an ever-present memory and fear. Pestilence, ‘a fatal epidemic or disease’, was borrowed from French and Latin, and first appears in Wycliffe’s bible of a1382, not long after this first great devastation. The related term pest (from French peste) appeared shortly afterwards. Our weakened uses of pest—an insect that infects crops, an annoying person—stem from this original plague usage. Pox (from the plural of pock, denoting a pustule or the mark it leaves) appeared in 1476 as a term applied to a number of virulently contagious diseases, most especially the dreaded smallpox (first recorded in the 1560s).

 It was the great plagues of the seventeenth century, however, that opened the floodgates for the entry into English of words to describe the experience of epidemic disease. Epidemic and pandemic both appeared in the seventeenth century; the Black Plague (so called from the black pustules that appeared on the skin of the victims) was first used in the early 1600s (although its more familiar synonym Black Death, surprisingly, did not appear until 1755). It was the seventeenth-century plague that saw a whole village in Derbyshire choose to self-isolate or self-quarantine; the adjective self-quarantined was first applied, in a historical description from 1878, to the story of the heroic population of Eyam, which isolated itself in 1665-6 to avoid infecting the surrounding villages, and lost around a third of its population as a consequence.

As the world expanded, so too did the spread of diseases and their vocabulary. Yellow fever appeared in 1738, and the so-called Spanish influenza in 1890 (reduced to Spanish flu during the great epidemic of 1918). Poliomyelitis appeared in 1878 (shortened to polio in 1911), although the epidemic that attacked children especially and struck fear into the heart of parents was at its worst just after WWII. Recent decades have also seen their share of linguistic coinages for epidemics and pandemics. AIDS (acquired immune deficiency syndrome) appeared in 1982, and SARS (severe acute respiratory syndrome) in 2003. The coronaviruses themselves (so-called because they resemble the solar corona) were first described as long ago as 1968 in a paper in Nature, but before 2020 few people had heard of the term beyond the scientists studying them.

As we continue to monitor our in-house corpora and other language data to spot new words and senses associated with the pandemic and assess the frequency of their usage, the OED will keep updating its coverage to help tell the story of these times that will inevitably become embedded in our language.

The post Social change and linguistic change: the language of Covid-19 appeared first on Oxford English Dictionary.


Contact Tracing COVID-19 Infections via Smartphone Apps


Google and Apple have announced a joint project to create a privacy-preserving COVID-19 contact tracing app. (Details, such as we have them, are here.) It's similar to the app being developed at MIT, and similar to others being described and developed elsewhere.
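In rough outline (this is a deliberately simplified illustration of the decentralized approach these proposals share, not the actual Apple/Google protocol; key derivation, rotation schedules, and risk scoring are all omitted): each phone broadcasts short-lived random tokens over Bluetooth, remembers the tokens it hears nearby, and later checks them against tokens voluntarily published by users who report a diagnosis.

```python
# Simplified illustration of decentralized Bluetooth contact tracing.
# NOT the Apple/Google Exposure Notification spec: key derivation, token
# rotation intervals, and exposure scoring are reduced to the bare idea.
import secrets

class Phone:
    def __init__(self):
        self.my_tokens = []        # random identifiers this phone has broadcast
        self.heard_tokens = set()  # identifiers observed from nearby phones

    def broadcast_token(self) -> bytes:
        """Generate and 'broadcast' a fresh short-lived random identifier."""
        token = secrets.token_bytes(16)
        self.my_tokens.append(token)
        return token

    def hear(self, token: bytes) -> None:
        """Record a token received over Bluetooth from a nearby phone."""
        self.heard_tokens.add(token)

    def report_diagnosis(self) -> list:
        """On a positive test, upload this phone's own tokens to a public list."""
        return list(self.my_tokens)

    def check_exposure(self, published) -> bool:
        """Match the public list of diagnosed tokens against tokens heard locally."""
        return any(t in self.heard_tokens for t in published)

# Usage: Alice and Bob are near each other; Bob later reports a diagnosis.
alice, bob = Phone(), Phone()
alice.hear(bob.broadcast_token())
print(alice.check_exposure(bob.report_diagnosis()))  # True: possible exposure
```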

I was going to write a long essay about the privacy and other concerns, but Ross Anderson beat me to it. (Note that some of his comments are UK-specific.)

First, it isn't anonymous. Covid-19 is a notifiable disease so a doctor who diagnoses you must inform the public health authorities, and if they have the bandwidth they call you and ask who you've been in contact with. They then call your contacts in turn. It's not about consent or anonymity, so much as being persuasive and having a good bedside manner.

I'm relaxed about doing all this under emergency public-health powers, since this will make it harder for intrusive systems to persist after the pandemic than if they have some privacy theater that can be used to argue that the whizzy new medi-panopticon is legal enough to be kept running.

Second, contact tracers have access to all sorts of other data such as public transport ticketing and credit-card records. This is how a contact tracer in Singapore is able to phone you and tell you that the taxi driver who took you yesterday from Orchard Road to Raffles has reported sick, so please put on a mask right now and go straight home. This must be controlled; Taiwan lets public-health staff access such material in emergencies only.

Third, you can't wait for diagnoses. In the UK, you only get a test if you're a VIP or if you get admitted to hospital. Even so the results take 1-3 days to come back. While the VIPs share their status on twitter or facebook, the other diagnosed patients are often too sick to operate their phones.

Fourth, the public health authorities need geographical data for purposes other than contact tracing - such as to tell the army where to build more field hospitals, and to plan shipments of scarce personal protective equipment. There are already apps that do symptom tracking but more would be better. So the UK app will ask for the first three characters of your postcode, which is about enough to locate which hospital you'd end up in.

Fifth, although the cryptographers - and now Google and Apple - are discussing more anonymous variants of the Singapore app, that’s not the problem. Anyone who's worked on abuse will instantly realise that a voluntary app operated by anonymous actors is wide open to trolling. The performance art people will tie a phone to a dog and let it run around the park; the Russians will use the app to run service-denial attacks and spread panic; and little Johnny will self-report symptoms to get the whole school sent home.

I recommend reading his essay in full. Also worth reading are this EFF essay, and this ACLU white paper.

To me, the real problems aren't around privacy and security. The efficacy of any app-based contact tracing is still unproven. A "contact" from the point of view of an app isn't the same as an epidemiological contact. And the ratio of infections to contacts is high. We would have to deal with the false positives (being close to someone else, but separated by a partition or other barrier) and the false negatives (not being close to someone else, but contracting the disease through a mutually touched object). And without cheap, fast, and accurate testing, the information from any of these apps isn't very useful. So I agree with Ross that this is primarily an exercise in that false syllogism: Something must be done. This is something. Therefore, we must do it.


Running Alone Together

I find the blanket hostility toward runners disheartening, but people have grounds for fury when there definitely are runners who do not give good warning, or who do not run at a safe distance, or who spit. And people are scared, so finding scapegoats always seems to help with that. This is particularly true when outside space is either overcrowded or threatened. All the vitriol for runners here comes from densely populated cities. Yet some city authorities, from London to Los Angeles, have closed parks or trails. Where are people to go?

Fran Lebowitz Is Never Leaving New York

Michael Schulman interviews the writer Fran Lebowitz about growing old, life in quarantine, missing her friend Toni Morrison, and the sadness of seeing her city shut down during the coronavirus pandemic.
cjmcnamara (49 days ago): great from start to finish

Admissions, acceptances, and the possibly on-line fall.


Few colleges are talking openly about what instruction will look like in the Fall, and my prediction is that it will be a while before they do. There is an elephant in the room, which college administrators are well aware of, but most college faculty and the general public are oblivious to.

Here’s what we are all aware of. A decision about whether to continue with ‘alternative’ delivery (i.e., online teaching) in the fall may affect acceptance rates for selective colleges. A student may have her heart set on attending College X, but probably her heart is set on actually being there in person, and if she thinks that her first semester there will be online she may well choose, instead, to go to College Y, which also seems pretty good, if she thinks that College Y will be in person. (For simplicity’s sake I am ignoring the possibility that sophomores etc might decide just to skip a semester or a year, if we stay online in the Fall—that possibility matters a lot for the financial stability of the institutions, but not for what I am going to tell you). So, assuming that we are allowed to make choices about whether or not to be open in-person, there will be huge pressure to go in-person.

Here’s the complication.

I would guess that some of you believe, wrongly, that when you commit to attending a college on May 1st (or June 1st, if that is what it ends up being this year) you are making a commitment that is at least in some sense binding. In fact, as others of you know, that is not the case—if you change your mind, you just lose your deposit. It feels like a binding commitment because selective colleges abide by the National Association for College Admission Counseling (NACAC) code of ethics, which has long included provisions prohibiting attempts to poach students who have already committed to another college. So—after May 1st (or, if it changes, June 1st), no college will initiate communication with you if you have already committed to another college.

This year, for reasons that have nothing to do with COVID, that will change. In September 2019, responding to intense pressure from the Justice Department, NACAC removed those provisions from its code of ethics. The provisions that were stripped from the code are:


“Colleges must not offer incentives exclusive to students applying or admitted under an early decision application plan. Examples of incentives include the promise of special housing, enhanced financial aid packages, and special scholarships for early decision admits. Colleges may, however, disclose how admission rates for early decision differ from those for other admission plans.”

“College choices should be informed, well-considered, and free from coercion. Students require a reasonable amount of time to identify their college choices; complete applications for admission, financial aid, and scholarships; and decide which offer of admission to accept. Once students have committed themselves to a college, other colleges must respect that choice and cease recruiting them.”

“Colleges will not knowingly recruit or offer enrollment incentives to students who are already enrolled, registered, have declared their intent, or submitted contractual deposits to other institutions. May 1 is the point at which commitments to enroll become final, and colleges must respect that. The recognized exceptions are when students are admitted from a wait list, students initiate inquiries themselves, or cooperation is sought by institutions that provide transfer programs.”

“Colleges must not solicit transfer applications from a previous year’s applicant or prospect pool unless the students have themselves initiated a transfer inquiry or the college has verified prior to contacting the students that they are either enrolled at a college that allows transfer recruitment from other colleges or are not currently enrolled in a college.” [1]

Already, before COVID-19, college leaders were preparing for what this would mean. I talked with a number of college presidents, VPs of enrollment management, and provosts earlier this year, and they were all very anxious about it: one president of a small liberal arts college expressed the view that it would be extremely costly, and would result in several colleges closing even after the first year. Every college is now much more financially precarious than they were at the beginning of this year, and I can only imagine that their anxiety about what happens after May/June 1st is heightened. Suppose that it gets to June 15th, and your college, which has gotten exactly the number of acceptances it aimed for, starts signalling that the first semester—or even just the first part of the first semester—might be online. You are immediately a target for poachers; and whoever can sound most committed to in-person teaching has the best chance of winning.

Of course, colleges have limited control over whether they actually open in-person in the Fall. They all have an incentive, already, to pressure state authorities to allow them to stay open, regardless. But the COVID crisis is an invitation to even more chaos than administrators expected when this decision was made, and the decision will inhibit straightforward and honest deliberation about what to do, and will make planning even more difficult.

Who knows, maybe summer will come and we’ll go back to some semblance of normality.

[1] Here’s a useful article reporting the NACAC decision.

istoner (53 days ago, Saint Paul, MN, USA): This specific poaching free-for-all will only affect the elite schools, but I hadn't previously thought about similar dynamics at every level of institution.

If the public community colleges and tech schools were to stay online-only in the fall for public health reasons, the scammy for-profits will undoubtedly pitch hard for their students. That's going to make for huge pressure on everyone to open up in August, even though that looks unlikely to be safe.

COVID-19 reveals SO MANY ways we have chosen to make for ourselves a shithole of a country.