What’s In A Black Name? Ctd

Readers ramp up the thread:

About the discussion on discrimination against black names, there is also this research, in which researchers sent emails to professors across many disciplines seeking help and information about their PhD programs. The emails were signed with generic White, Black, Hispanic, Indian, and Chinese male and female names. Then they measured the responses: how fast the professors replied and how willing they were to help the student. Regardless of the professors’ discipline, sex, or race, White males had it the best and Asians the worst. The only exception was Chinese professors responding to Chinese students. So, there you go.

A few readers also point to a study entitled “Are Emily and Greg More Employable Than Lakisha and Jamal?” Its abstract concludes that “White names receive 50 percent more callbacks for interviews.” Another redirects to the real world:

I read your post and thought perhaps employers need to institute anonymous application processes. A quick search online revealed that some organizations (and governments) have done just that with positive results.

Here is a link to a recent article summarizing the efforts of the City of Celle in Germany. I find it encouraging that the data suggests that the hiring process becomes fairer when applications are judged only on work histories, levels of education, skills, and accomplishments. Seems like a strategy we could all embrace, does it not?
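To make the mechanics concrete: a blind screening pass can be as simple as stripping identifying fields from each application before reviewers ever see it. Here is a minimal sketch in Python, with hypothetical field names and record structure (the article doesn’t spell out Celle’s actual procedure):

```python
# Minimal sketch of blind screening: redact identifying fields from an
# application before reviewers see it. Field names here are hypothetical.

IDENTIFYING_FIELDS = {"name", "email", "photo", "address", "date_of_birth"}

def redact(application: dict) -> dict:
    """Return a copy of the application without identifying fields,
    keeping only job-relevant material: work history, education,
    skills, and accomplishments."""
    return {key: value for key, value in application.items()
            if key not in IDENTIFYING_FIELDS}

if __name__ == "__main__":
    app = {
        "name": "Lakisha Washington",
        "email": "lakisha@example.com",
        "work_history": ["Teller, 8 years", "Head teller, 4 years"],
        "education": "B.A., Finance",
        "skills": ["staff training", "cash operations"],
    }
    print(redact(app))  # reviewers see only the job-relevant fields
```

The point of the design is that redaction happens upstream of any human judgment, so the biases described in this thread never get a chance to operate.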

Another makes a broader point:

I think there’s something off about the post arguing that employers are making “not an entirely unreasonable assumption” when discounting the educational achievements of black applicants. The evidence presented is SAT scores: black students scored lower, on average, than the mean scores of students accepted to universities. The implication is that universities are accepting less-than-optimal students for the sake of affirmative action. And, ok, there may be some argument or debate there.

However, the purpose of SAT scores is to serve as a predictive measure of how students will do in college. Beyond the college application, they have absolutely no relevance. Why not? Affirmative action as a policy may help students with lower SAT scores get into college, but it doesn’t go to class for them, or write their papers, or take their final exams. Graduating college is an achievement that is mostly up to the student. So, a college degree on a resume stands on its own. My point is, affirmative action is actually not a plausible explanation for discrimination, at least not where education is concerned. If it were, then employers would ask for SAT scores on resumes. Guess what? They don’t.

Another shares an anecdote:

Many years ago I was a state civil rights investigator. I remember one college-educated long-time head teller at a bank who was finally upset enough about the young white guys with no college and no experience heading straight into management tracks that she filed a complaint. I talked to the branch manager about it. He told me that the young guy he’d just hired reminded him of himself when he was just starting out and he wanted to give the new guy the same chance someone had given him. He would have sworn on a stack of Bibles that he didn’t have a racist bone in his body. He thought the world of his head teller. He relied on her to train all the new staff. But in his unconscious world view, black women were tellers, white men were management.

Another attests to the often fickle nature of hiring:

I’m a hiring manager at a software company. I receive dozens or hundreds of applications for every open position. Generally I skip cover letters altogether and spend somewhere between 20 and 30 seconds scanning resumes to come to an initial decision of whether to reject or follow up. Even if there weren’t all sorts of academic studies telling us so, it’s pretty freaking clear to any even marginally self-aware person that this process is fraught with implicit biases. Have I heard of your college? Did I go to your college? Do I wonder whether you’ll “fit in” with our other employees? Do I want to drink beer with you? Is your job experience as impressive as it seems, or did you benefit from being a “diversity hire?”

This stuff sucks, but it’s the reality of the hiring process in good old meritocratic Silicon Valley. Other industries are probably even worse.

Update from a reader, who quotes a previous one:

So, a college degree on a resume stands on its own. My point is, affirmative action is actually not a plausible explanation for discrimination, at least not where education is concerned. If it were, then employers would ask for SAT scores on resumes. Guess what? They don’t.

ummm … actually … yes they do!

Big-name consulting firms such as McKinsey and Bain, as well as banks like Goldman Sachs, are among the companies that ask newly minted college grads for their scores in job applications. … Some other companies request scores even from candidates in their 40s and 50s. Jeff Bezos, the billionaire founder and CEO of Amazon, is one of the most famous proponents of using SAT scores in hiring decisions. Bezos scored highly on a standardized IQ test when he was only 8 years old, and in his early days as a manager, he liked to ask candidates for their SAT results in interviews he conducted. He has said that “hiring only the best and brightest was key to Amazon’s success.”

The Hierarchy Of Innovation

Nick Carr theorizes that there “has been no decline in innovation; there has just been a shift in its focus.” His idea is that “there’s a hierarchy of innovation that runs in parallel with Abraham Maslow’s famous hierarchy of needs”:

If progress is shaped by human needs, then general shifts in needs would also bring shifts in the nature of technological innovation. The tools we invent would move through the hierarchy of needs, from tools that help safeguard our bodies on up to tools that allow us to modify our internal states, from tools of survival to tools of the self. … The focus, or emphasis, of innovation moves up through five stages, propelled by shifts in the needs we seek to fulfill. In the beginning come Technologies of Survival (think fire), then Technologies of Social Organization (think cathedral), then Technologies of Prosperity (think steam engine), then Technologies of Leisure (think TV), and finally Technologies of the Self (think Facebook, or Prozac).

As with Maslow’s hierarchy, you shouldn’t look at my hierarchy as a rigid one. Innovation today continues at all five levels. But the rewards, both monetary and reputational, are greatest at the highest level (Technologies of the Self), which has the effect of shunting investment, attention, and activity in that direction.

Canine Competitors

A reader writes:

The video you posted of the agility trial is a good example of a local, hobbyist-level event. I play agility with my two dogs at this level, competing for titles and ribbons. There is, however, a competitive sports level of dog agility, and it’s a different beast altogether. The handlers are athletes themselves, conditioning and sprint training to stay ahead of their four-legged teammates to give timely and clear cues. The dog obviously doesn’t know the course, and the handler gets eight minutes before the runs begin to walk the course, memorize it, and decide on a handling strategy. The teams that place and make it onto the podium are winning by tenths of a second. Imagine outrunning a border collie, and you can start to appreciate the level of athleticism involved.

On the video seen above:

Your readers might enjoy watching an agility superstar like Lisa Frick and her dog, Hoss, win Gold at the FCI Agility World Championships, the oldest, toughest, and most prestigious international competition, the Olympics of dog agility.

To Screen But Not To See

Iraq veteran Brian Turner, author of the memoir My Life as a Foreign Country, has an insightful take on American Sniper and how the film reflects our flawed understanding of the conflict:

This isn’t the defining film of the Iraq War. After nearly a quarter century of war and occupation in Iraq, we still haven’t seen that film. I’m beginning to think we’re incapable as a nation of producing a film of that magnitude, one that would explore the civilian experience of war, one that might begin to approach so vast and profound a repository of knowledge. I’m more and more certain that, if such a film ever arrives, it’ll be made by Iraqi filmmakers a decade or more from now, and it’ll be little known or viewed, if at all, on our shores. The children of Iraq have far more to teach me about the war I fought in than any film I’ve yet seen — and I hope some of those children have the courage and opportunity to share their lessons onscreen. If this film I can only vaguely imagine is ever made, it certainly won’t gross $100 million on its opening weekend.

The biggest problem I have with American Sniper is also a problem I have with myself.

It’s a problem I sometimes find in my own work, and it’s an American problem: We don’t see, or even try to see, actual Iraqi people. We lack the empathy necessary to see them as fully human. In American Sniper, Iraqi men, women, and children are known and defined only in relation to combat and the potential threat they pose. Their bodies are the site and source of violence. In both the film and our collective imagination, their humanity is reduced in ways that, ultimately, define our own narrow humanity. In American Sniper, Iraqis are called “savages,” and the “streets are crawling” with them. Eastwood and his screenwriter Jason Hall give Iraqis no memorable lines. Their interior lives are a blank canvas, with no access points to let us in. I get why that is: If Iraqis are seen in any other light, if their humanity is recognized, then the construct of our imagination, the ride-off-into-the-sunset-on-a-white-horse story we tell ourselves to push forward, falls apart.

If we saw Iraqis as humans, we’d have to learn how to live in a world far, far more complicated and painful than the difficult, painful one we currently live in.

Get High, Lower Rape?

Annie Lowrey floats the idea:

According to government research, every year, 97,000 students are “victims of alcohol-related sexual assault or date rape,” with alcohol consumption having a profound effect on perpetrators’ behavior. … [T]here is some evidence that young people tend to substitute pot for alcohol. They either burn one down or chug one down; more pot means less beer. And there’s also evidence that increasing the price of beer nudges young people to switch to pot — with some significant effects, including lower rates of violent crime. “Alcohol is clearly the drug with the most evidence to support a direct intoxication-violence relationship,” according to one paper in the journal Addictive Behaviors. “Cannabis reduces likelihood of violence during intoxication.”

Robby Soave applauds Lowrey, calling her approach “reverse reefer madness.”

A Mini Military-Industrial-Complex, Ctd

Ronald Bailey flags a disquieting item:

Some 50 police agencies, including the U.S. marshals and the FBI, have for two years been using Range-R Doppler radar devices that can see 50 feet through walls, including brick and concrete ones, to detect the location of people inside their houses. And in some cases law enforcement officers are using them without search warrants. As USA Today reports:

The radars were first designed for use in Iraq and Afghanistan. They represent the latest example of battlefield technology finding its way home to civilian policing and bringing complex legal questions with it….

Previous Dish on that trend here. Maria Santos notes regarding the radar device:

One appeals court case tackled the technology in December after police used the radar to arrest a man violating his parole. The court ultimately upheld law enforcement’s actions, but judges remarked that “the government’s warrantless use of such a powerful tool to search inside homes poses grave Fourth Amendment questions.” Justice Department officials are now reviewing the case.

Faces Of The Day

People demonstrate in front of the National Congress in Buenos Aires over the death of Argentine public prosecutor Alberto Nisman, who was found shot dead this weekend, just days after accusing President Cristina Kirchner of obstructing a probe into a 1994 Jewish center bombing. Nisman, 51, who was just hours away from testifying at a congressional hearing, was found dead overnight on Sunday in his apartment in the trendy Puerto Madero neighbourhood. “I can confirm that a .22-caliber handgun was found beside the body,” prosecutor Viviana Fein said. The nation’s top security official said Nisman appears to have committed suicide. By Alejandro Pagni/AFP/Getty Images. The latest:

Confronted with a deepening scandal, the president of Argentina abruptly reversed herself on Thursday, saying that the death of the lead prosecutor investigating the 1994 bombing of a Jewish center was not a suicide as she and other government officials had suggested. The president, Cristina Fernández de Kirchner, instead suggested that the prosecutor’s death was part of what she hinted was a sinister plot to defame and destroy her. … Mrs. Kirchner has not appeared in public since Mr. Nisman’s death, and her vice-president has been visiting Bolivia. She posted a link to her latest remarks on Twitter. When she first wrote about Mr. Nisman on Tuesday, it was on her Facebook page, leading Mr. de Narváez to call her leadership “adolescent.”

What’s Killing The Bees? Ctd

The crisis could be spreading:

Wild bees are at risk of catching diseases from their struggling domesticated brethren, according to a recent study published in The Journal of Applied Ecology.

The study, led by evolutionary geneticist Lena Wilfert of the University of Exeter, adds a new layer to the crisis known as colony collapse disorder (CCD), which has precipitated an alarming drop in honeybee populations worldwide. … [T]he use of pesticides isn’t the only anthropogenic driver behind the transmission of pathogens into the wild. Wilfert’s team also found that many commercial beekeepers are creating ideal conditions for virulent diseases to emerge. “High densities within breeding facilities and in commercial pollination operations increase the contact rate between infected and uninfected conspecifics, thereby lowering the threshold for disease emergence,” the authors explain.
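To unpack that threshold language: in standard epidemiological models (a textbook relation, not a result of this particular study), a pathogen can spread when its basic reproduction number R0 = τc/γ exceeds 1, where τ is the transmission probability per contact, c is the contact rate, and γ is the recovery rate. Crowding bees together raises c, which is how dense commercial operations can push a disease past the emergence threshold.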

Helen Briggs reports from across the Pond:

Vanessa Amaral-Rogers of the charity Buglife said the results of the study showed an urgent need for changes in how the government regulates the importation of bees. “Wild honey bees can no longer be found in England or Wales, thought to have been wiped out by disease,” she told BBC News. “Now these studies show how diseases can be transmitted between managed honey bees and commercial bumble bees, and could have potentially drastic impacts on the rest of our wild pollinators.”

A study last year on a sample of commercial bumble bee hives imported into the UK found 77% were contaminated with up to five different parasites, with a further three being found in the pollen that was brought in with them, she added.

More Dish on the bee problem here and here.

How Can We Beef Up Cyber Security?

Adam Segal sees new cyber laws as a real possibility:

Could this finally be the year when the Congress passes cyber legislation? I think yes. Public awareness of the threat is at an all-time high. The Sony attack has created pressure for Congress to act (though it is not clear that any of the legislation would have prevented the North Korean hackers from breaching the company). Moreover, there is bipartisan support for cybersecurity legislation. … [W]hile disparaging most of the President’s agenda, prominent Republicans like Senator Lamar Alexander of Tennessee have pointed to cybersecurity as an area where “we can get some agreement.” As in the past, privacy concerns will make or break the legislation, but we should expect to see real signs of progress.

Katie Benner examines the cyber proposals in Obama’s SOTU:

The Obama ideas with the most potential to bolster corporate security are his threat-sharing measure and the corporate disclosure rule.

As I’ve written before, collaboration is considered to be one of the best defenses against cybercrime, but a recent PricewaterhouseCoopers survey found that only 25 percent of businesses currently share information about attacks. Obama wants to encourage companies to share threat data with the government in order to get liability protection. … The disclosure rule isn’t useful because it increases security per se, but because it gives companies an incentive to pre-emptively beef up their defenses.

However, Timothy Edgar declares that no “proposal in Obama’s State of the Union address would truly hold companies accountable for cyber insecurity”:

If you are looking for effective ideas on this score, you would do better to listen to students here at Brown University where I’ve lately been teaching.

One student’s idea was to build on existing “bug bounty” programs in which software companies pay researchers money for uncovering security flaws by turning the federal hacking law on its head. Today, all intrusions—even “white hat” penetrations for security research—are illegal unless the system owner consents. A company with lousy security may threaten a security researcher with a lawsuit or jail time for pointing out a gaping hole in its defenses. What if Congress reversed this perverse law, requiring companies to pay ethical hackers for demonstrating vulnerabilities?