Doing The Right Thing At Rikers

John Surico relays some actual good news out of the Big Apple when it comes to criminal justice:

New York City’s Rikers Island, the second-largest jail in America, has been making headlines for the worst possible reasons lately. Between the ritualized beating of teenage inmates and horrendous treatment of mentally ill prisoners, it’s no wonder the feds are suing NYC over civil rights violations at the sprawling complex.

So it came as something of a surprise when, on Tuesday morning, New York City officials announced that, starting next year, Rikers Island will no longer hold anyone in solitary confinement under the age of 21, making it the first jail of its kind to do so in the country. Furthermore, no one—regardless of age—will be allowed to suffer in solitary for more than 30 consecutive days, and a separate housing unit will be established for those inmates most prone to violence.

Suffice it to say this is a big deal: Rikers Island just went from being one of the most dysfunctional jails in America to tentatively claiming a spot at the vanguard of the emerging criminal justice reform movement.

Major props to Mayor de Blasio, who last March appointed a new jails commissioner who promised to “end the culture of excessive solitary confinement.”  But not all the news from Rikers was good this week:

A clean criminal record is not a requirement for a career as a Rikers correction officer, the Times reports. Got a gang affiliation and multiple arrests? No problem! Deep-seated psychological issues? Can’t hurt to apply! The Department of Investigation reviewed the Correction Department’s hiring process and discovered “profound dysfunction.” …

This administrative horrorshow does go a long way toward explaining why Rikers has found itself embroiled in so many awful situations, which include but are not limited to the inmate who was beaten and sodomized by a Correction Officer, the teen who died in solitary from a tear in his aorta, and the mentally ill man who died in solitary after guards declined to check on him.

Indeed, violence against prisoners has been at a record high:

Though the de Blasio administration promised reforms and a federal prosecutor’s report brought new attention to a “deep-seated culture of violence” at Rikers Island, in 2014 New York City correction officers set a record for the use of force against inmates. According to the Associated Press, guards reported using force 4,074 times last year, up from 3,285 times the previous year.

Back to the issue of solitary, it’s worth revisiting Jennifer Gonnerman’s great piece on Kalief Browder, who was sent to Rikers at 16 after being accused, but never convicted, of taking a backpack. He ended up spending three years there, most of it in solitary. A snapshot:

[Browder] recalls that he got sent [to solitary] initially after another fight. (Once an inmate is in solitary, further minor infractions can extend his stay.) When Browder first went to Rikers, his brother had advised him to get himself sent to solitary whenever he felt at risk from other inmates. “I told him, ‘When you get into a house and you don’t feel safe, do whatever you have to to get out,’ ” the brother said. “ ‘It’s better than coming home with a slice on your face.’ ”

Even in solitary, however, violence was a threat. Verbal spats with officers could escalate. At one point, Browder said, “I had words with a correction officer, and he told me he wanted to fight. That was his way of handling it.” He’d already seen the officer challenge other inmates to fights in the shower, where there are no surveillance cameras.

Browder later tried to commit suicide by slitting his wrists with shards of a broken bucket. Also from Gonnerman’s piece:

Between 2007 and mid-2013, the total number of solitary-confinement beds on Rikers increased by more than sixty per cent, and a report last fall found that nearly twenty-seven per cent of the adolescent inmates were in solitary. “I think the department became severely addicted to solitary confinement,” Daniel Selling, who served as the executive director of mental health for New York City’s jails, told me in April; he had quit his job two weeks earlier. “It’s a way to control an environment that feels out of control—lock people in their cell,” he said. “Adolescents can’t handle it. Nobody could handle that.”

Another story of a teen in solitary is seen in the above video. Previous Dish on the inhumane practice here.

To Protect And Bomb

This week ProPublica reported on law enforcement’s overuse of flashbang grenades:

Police argue that flashbangs save lives because they stun criminals who might otherwise shoot. But flashbangs have also severed hands and fingers, induced heart attacks, burned down homes and killed pets. A ProPublica investigation has found that at least 50 Americans, including police officers, have been seriously injured, maimed or killed by flashbangs since 2000. That is likely a fraction of the total since there are few records kept on flashbang deployment.

The U.S. Court of Appeals for the 7th Circuit wrote in 2000 that “police cannot automatically throw bombs into drug dealers’ houses, even if the bomb goes by the euphemism ‘flash-bang device.’” In practice, however, there are few checks on officers who want to use them. Once a police department registers its inventory with the Bureau of Alcohol, Tobacco, Firearms and Explosives, it is accountable only to itself for how it uses the stockpile. ProPublica’s review of flashbang injuries found no criminal convictions against police officers who injured citizens with the devices.

The article goes on to detail several incidents of flashbang misuse. Sullum comments on one of them:

One of those Little Rock raids involved a grandmother accused of illegally selling beer and food from her home—a misdemeanor that triggered the use of a battering ram and a flash-bang, which started a fire in a pile of clothing. “If she hadn’t been selling illegal items out of the home, no warrant would have been served,” the police spokesman told ProPublica. “What you call extreme, we call safe.”

To my mind, tossing explosive devices into the homes of nonviolent offenders is decidedly unsafe; in fact, it is inherently reckless, especially since there is always the possibility that innocent bystanders like Treneshia Dukes or Bou Bou Phonesavan will be injured or killed. Even if the police had found the drugs they were expecting in those cases, that would hardly justify their paramilitary assaults. Flash-bangs, like SWAT tactics generally, should be reserved for life-threatening situations involving hostages, barricaded shooters, and the like. They should not be casually tossed into the mix of techniques for busting unauthorized sellers of bud or beer.

Update from a reader:

To make matters worse in the case of Bou Bou, the county is refusing to pay the $1 million+ in medical bills that the family now has and will continue to have as Bou Bou will need at least 10 more reconstructive surgeries over the next 20 years. The county says that it would be illegal for them to pay the bills as state law grants them sovereign immunity from negligence claims, so any money they would pay the family would constitute an illegal “gratuity”. It’s unconscionable that the police can nearly kill this kid with their reckless and overzealous tactics and just walk away without having to pay a dime to the family they just financially destroyed.

The Best Of The Dish Today

And so we get a little warmer:

Central Intelligence Agency Director John Brennan consulted the White House before directing agency personnel to sift through a walled-off computer drive being used by the Senate Intelligence Committee to construct its investigation of the agency’s torture program, according to a recently released report by the CIA’s Office of the Inspector General. The Inspector General’s report, which was completed in July but only released by the agency on Wednesday, reveals that Brennan spoke with White House chief of staff Denis McDonough before ordering CIA employees to “use whatever means necessary” to determine how certain sensitive internal documents had wound up in Senate investigators’ hands.

So at the very least, the White House was aware of the CIA’s intent to spy on the Senate staffers, as part, one can only presume, of its campaign to shield the CIA from any accountability on torture and to undermine as much as possible the SSCI report. Then we have this blather from Josh Earnest today:

Reporter: Was it acceptable for the CIA to go into these computers and look at what, you know, at what the senate committee was doing? Senator Feinstein was very, very upset about this. Does the White House have a position about whether the CIA acted properly or not?

Josh Earnest: What is most important is we have a group of individuals within area of expertise who can sit down and take an impartial look at all the facts and determine exactly what happened to offer up some prescriptions for what can be changed to ensure that any sort of miscommunication or anything that would interfere with the ability of congress to conduct proper oversight of the CIA is avoided in the future. You can read the report. It’s been declassified and released. And they included a set of procedural reforms they believe will be helpful in avoiding any of these disagreements in the future. Fortunately, the director of the CIA said he’d implement these reforms. What’s most important is that there’s an effective relationship between the intelligence agencies and the committees that oversee them.

My favorite line in that pabulum is: “Fortunately, the director of the CIA said he’d implement these reforms.” Fortunately. It’s not as if the president could simply order him to do so. He must ask very, very politely; and if he’s very, very lucky, the CIA Director may occasionally agree.

Some posts today worth revisiting: why women cry more than men; watching Charlie self-censorship in Britain on live TV; the burgeoning Republican consensus that 11 million people need to be forcibly deported from the US; the miracle of the latest Google Translate app; and how Obama’s Gallup approval ratings at this point in his second term are identical to Reagan’s.

The most popular post of the day was Charlie, Blasphemer; followed by Quotes For The Day.

We’ve updated many recent posts with your emails – read them all here. You can always leave your unfiltered comments at our Facebook page and @dishfeed. 19 more readers became subscribers today. You can join them here – and get access to all the read-ons and Deep Dish – for as little as $1.99 a month. Gift subscriptions are available here. Dish t-shirts are for sale here and our coffee mugs here. One happy recipient from the holidays:

A few weeks ago I wrote and said “fuggedaboudit!” because of the shipping and handling cost of the Dish mug and I said I wouldn’t buy one.

On Christmas Day I was shocked, I tell you, shocked! When I opened a present from my thoughtful wife there was a Dish mug! She took it into her own hands to send for it and I am so tickled and happy to get it! They are great and it hasn’t left my hand since I took it out of the box. All Christmas Day it was filled with coffee, tea, eggnog, or wassail punch.

They are superb, and I say to everybody: just do it and GET ONE!

You can do so here – but soon, since there are only a small number of mugs remaining from our limited run (we opted for the higher-quality mugs over the print-on-demand ones).

See you in the morning.

Was Selma Really Snubbed?

Another year, another diversity headcount for the Oscars. But Linda Holmes does note a striking consistency among this year’s nominees:

Even for the Oscars — even for the Oscars — this is a really, really lot of white people. Every nominated actor in Lead and Supporting categories — 20 actors in all — is white.

Every nominated director is male. Every nominated screenwriter is male. Shall we look at story? Every Best Picture nominee here is predominantly about a man or a couple of men, and seven of the eight are about white men, several of whom have similar sort of “complicated genius” profiles, whether they’re real or fictional.

She wishes Selma director Ava DuVernay had been nominated:

[This] is a disappointment not only for those who admired the film and her careful work behind the camera, but also for those who see her as a figure of hope, considering how rare it is for even films about civil rights to have black directors, and how rare it is for any high-profile project at all to be directed by a woman. Scarcity of opportunity tends to breed much lower tolerance for the whimsical sense that nominations normally have, so that even people who know better than to take Oscar voting to heart feel the sting of what seems like a deliberate snub.

Selma did get two nominations, for Best Original Song and Best Picture. But that’s still a snub to many.

However, Joe Reid points out that Selma‘s production schedule greatly hindered the nomination process: “The film wasn’t completed until late November, which left it too late for distributor Paramount to send out screener DVDs to voters in most awards organizations, including the Screen Actors Guild, Producers Guild, and Directors Guild.” Tim Gray goes into greater detail:

[S]creeners were sent to BAFTA and [Academy] voters Dec. 18, after Par [Paramount] paid a premium for sped-up service, since it usually takes up to six weeks to prep DVDs for voters. … In theory, Par could have met final deadlines for some of the guilds, but instead it focused on in-person screenings in New York and Los Angeles, since the studio had already missed some guild deadlines.

Mark Harris also notes the less sexy factor of screener scheduling:

As I hope a lot of companies are realizing this morning, it is just about always a mistake to release a serious Oscar movie in the second half of December. Yes, American Sniper did very well today, with half a dozen nominations to take into its first weekend of national release. But it’s the only one of the eight Best Picture choices this year to open after December 1 besides Selma.

Joe Concha sighs:

So please, can we save the racist rhetoric for a more worthy discussion?

Selma — I would submit a better movie overall than 12 Years a Slave [which was nominated for 9 Oscars and won Best Picture, Supporting Actress, and Adapted Screenplay] — was still nominated for Best Picture. It may very well win (Vegas odds coming shortly). Sometimes (often) the Academy screws up with the nominations, and I’ve still never forgiven them for selecting Dances with Wolves over Goodfellas for Best Picture or Forrest Gump over Shawshank. But for credible newsmen and publications to call out the Academy’s skin color as the defining factor in what and who gets nominated is the kind of dialogue that cheapens the argument.

Freddie is frustrated as well:

Selma has already rapidly become one of those artistic objects that our chattering class will not allow to exist simply as art, and instead is used as a cudgel with which to beat each other over various petty ideological sins. … [O]ur media is filled with people who presume to speak for those who lack privilege but who enjoy it themselves, racial and economic privilege. The difference in stakes between those who suffer under racism and classism and most of those who just write about them distorts the conversation over and over again. Which leads to things like last year, where people preemptively complained about the racism inherent in 12 Years a Slave not winning, whining about American Hustle and white privilege, and then actually seemed disappointed when 12 Years did win. You know you’re a privileged person when the fun of complaining about injustice outweighs the pleasure of a just outcome.

Could there be a national conversation about the various issues playing out here that was edifying, smart, and meaningful? Sure. Will there be? Hahahaha, no! There’s tons of important things to be said about the relationship between art and politics, about the continuing racism of Hollywood, about what it means to be universal in the way that Boyhood is frequently praised for (and largely black films usually aren’t), about what it means to be Oscar-bait in the 21st century…. But I can pretty much guarantee you that we won’t have an effective conversation about any of it, because lately our whole apparatus seems broken.

By the way, it’s worth recalling another big controversy over a critically acclaimed film getting “snubbed” at the Oscars, in 2006, because of perceived prejudice:

West Coast critics seemed to favor Crash, a movie about Los Angeles, whose characters spent a lot of their time in their cars. A massive ensemble piece that seemed to employ half the Screen Actors Guild (no wonder actors who were Academy members liked it), it purported to make a grand statement on the still-troubling issue of Racism: It infects everybody. East Coast critics, however, found Crash‘s racial politics simplistic and its plotting too full of programmatic twists and coincidences (nearly every character is revealed to be something other than the hero or heel he or she seems at first.)

Instead, they favored Brokeback Mountain, which deconstructed cherished Western archetypes about cowboys, machismo, and rugged individualism in order to tell mainstream Hollywood’s first gay love story. And while director Ang Lee won an Oscar for his sensitive handling of the material, its three principal actors were snubbed (a particularly galling omission in the case of Heath Ledger, whose breakthrough performance turned out to be the last opportunity to give him a trophy while he was alive, and whose posthumous prize for The Dark Knight is often considered a consolation prize for his being passed over here).

And the Angelenos who make up the bulk of the Academy gave Best Picture to Crash.

The Politics Of “Fertility Fog” Ctd

Several more women tell their stories:

Ugh, that last letter you published, which included this phrase “I, and many of my Ivy League, high-income friends” was HORRIBLE. Here’s what I have to say: Ladies, don’t be so selfish. If you choose to focus on yourself (and there is nothing wrong with that) for the first 20 years of your adult life, don’t COMPLAIN when you then have the a) money, and b) time to pursue IVF. Be HAPPY that you are so fortunate.

I’m 31 and do not have children of my own. I am in a relationship with a man who has a 6-year-old and an 8-year-old. We live together and he has shared custody of his kids. I am there too, doing homework, helping out with dinner. Some might say I have a family – and in some ways, yes, I definitely do – but they are not my children and never really will be. He and I have discussed having one of our own someday, but honestly, I don’t know that it will happen. But that’s ok.

Because I made a CHOICE. Just like that Ivy-League educated, rich lady did. We make choices. These choices involve tradeoffs. She’s lucky to even be able to MAKE that choice. She had access to birth control to avoid pregnancy, and enough education and income to have a “pretty freaking awesome” life. So have I – but I am owning up to the fact that I prioritized awesomeness in my 20s over procreating, and perhaps that means I may not reproduce.

Don’t make a choice and then complain about the result of your choice.

Another is “really shocked and saddened to read these stories about doctors avoiding discussing fertility with their patients”:

It makes me even more grateful for my OB/GYN who, when I was 36 (and still single), directly addressed the subject at my annual appointment:

Her: “Do you want to have children?”
Me: “Hmm.  I haven’t really thought about it.  I guess so.  At some point.”
Her: “Well, if you do, here’s the name and address of a fertility clinic.  You shouldn’t wait any longer.”

And when I walked out of her office, it hit me – yes, I did want to be a mother, and yes, it was about time to do that. And thanks to her (and an anonymous sperm donor), I have spent the last sixteen years being a mother. I can’t believe I might have missed this experience and this life that I love were it not for the not-very-subtle prodding of my OB/GYN.

Another reader reminds us:

The risk of Down syndrome and some kinds of birth defects increases with the age of the mother, and the risk of others with the age of the father. For the mother – from the CDC:


On the other hand, risk of gastroschisis (intestines outside the body) decreases with age of the mother. For fathers, their increased age is associated with increased risk of bipolar disorder, schizophrenia, and autism.

Another reader tells multiple stories:

I can tell you all about fertility fog. (Sorry this got really long.)

When I was about 34, in a serious relationship with a younger man, but not engaged, a friend told me she wanted to talk to me about something serious.  She told me that I shouldn’t wait to have a baby, that she’d had age-related fertility problems at 35 and didn’t want the same thing to happen to me.  It was upsetting, because I wasn’t in a position to go home and try and get pregnant that night. But I didn’t dismiss her.  I told my boyfriend (now husband) what she’d said, and that, while I wasn’t giving him an ultimatum, I couldn’t wait forever because I wanted a family.  When I started openly talking about adoption or donor sperm, he got the hint, and we got engaged. I was married just before turning 35.

Lo and behold, I had problems getting pregnant (which may not actually have been caused by my age, but being older didn’t help).  I learned a lot about infertility and was very quick to seek aggressive medical help when it didn’t happen right away.  With lots of medical help, I was able to have three children, including twins.  (By the way, my gynecologist acted like my age was a non-issue, and if I’d left it up to her, I would have waited a lot longer to seek help from a fertility specialist. I am very grateful to my friend for lighting a fire under me.)

I was quite open with my friends, all career women, about our struggles, and very clear about the perils of waiting. They were mostly skeptical that fertility really starts to decline in your late 20s, and falls off a cliff after 35.  They latched on to the stories that comforted them. They would read about celebrities seeming to get pregnant very easily in their mid to late-40s and assumed that they would also be able to do so.  They seemed to think I was just bitter when I assured them that the celebrities were almost certainly using donor eggs.  I can’t tell you how vindicated I felt when it was revealed in Cheryl Tiegs’ divorce that she had indeed used donor eggs to have her children at 52.  She had gone around on talk shows telling people that the twins had been conceived using her eggs and her husband’s sperm because she’d taken such good care of herself that her reproductive organs were “young.”  (I know for a fact that her doctor would be asked at infertility conferences to do for these regular women what he’d done for Tiegs.  He couldn’t tell the truth, of course, because of doctor/patient confidentiality, but he hinted.)

Anyway, two stories.  One friend seemingly took me seriously.  When she got engaged in her late 30s, she planned a short engagement, and we had a sit down where I told her everything I knew about how to up her odds of getting pregnant quickly.  She was quite worried.  Lucky for her, she got pregnant easily and had a healthy baby.  I think she probably felt I’d made her panic for nothing.  She also apparently thought that her pregnancy cured her of her age.  Instead of trying quickly for a second child (she wanted several), she decided it would be ideal to have at least two years between children, and although she was now in her 40s, decided to wait that long to start trying again.  I tried to gently hint that she might not want to wait, all the same issues applied, but, to my regret, I wasn’t pushy about it, and didn’t tell her she was an idiot for waiting (even though that’s what I thought).

Well, she did get pregnant again after a while and miscarried.  She miscarried several times. Her miscarriages were almost certainly as a result of her older eggs.  But she refused to believe it and refused any treatment that might increase her odds of success.  She even called me crying once to tell me that her new doctor had told her that her miscarriages were the result of her age, and that she should see a fertility doctor right away while there was still some hope.  I told her the doctor was being reasonable, but it was like she was deaf.  She switched doctors!  She never had another child and was sad about it for many years.  I always have felt badly that I did not push her harder not to wait.  But I don’t think she would have heard me.  If she wouldn’t hear a doctor, why would she hear me?

Second story.  A friend was having no luck meeting a guy, so decided to go the donor sperm route.  She got all set up, picked the clinic, the donors, the whole thing, and was ready to begin.  Then one day we were having lunch and she told me she’d decided to wait at least another year.  (She was in her late 30s.) I again tried to gently encourage her not to wait.  I remember asking her whether, if she knew that by waiting she wouldn’t be able to have a child would she still wait and she said yes. She also said she would never pursue treatment.  I left it at that.  She knew my story.

A year or so later, she finally tried to get pregnant and it didn’t work.  She had tests done and the results were not great, her likelihood of success not good, but she decided to do IVF (which she said she’d never do) anyway.  Her several attempts failed pretty miserably.  And she told me that I “should have told her that it was a mistake to wait”!!!   I felt awful, but also a little pissed.  I did tell her.  I just didn’t TELL her.  It was probably already too late anyway, to be honest.  She ended up adopting a child, eventually married, has stepchildren and is very happy.  I don’t think she has any regrets now, so it worked out for the best, but there was a lot of heartache in between.

Map Of The Day

[youtube https://www.youtube.com/watch?v=pJxrTzfG2bo&start=13]

History professor Claudio Saunt created the above time-lapse, as well as a corresponding interactive map, to emphasize “the fact that the United States is built on someone else’s land”:

By the time the Civil War came to an end in 1865, it had consumed the lives of 800,000 Americans, or 2.5 per cent of the population, according to recent estimates. If slavery was a moral failing, said Lincoln in his second inaugural address, then the war was ‘the woe due to those by whom the offense came’. The rupture between North and South forced white Americans to confront the nation’s deep investment in slavery and to emancipate and incorporate four million individuals. They did not do so willingly, and the reconstruction of the nation is in many ways still unfolding. By contrast, there has been no similar reckoning with the conquest of the continent, no serious reflection on its centrality to the rise of the United States, and no sustained engagement with the people who lost their homelands.

Update from a reader:

“The fact that the United States is built on someone else’s land”. Um, isn’t this true of every modern nation? Isn’t it true that the Native Americans fought and killed each other to take land from their neighbors?

(Hat tip: Nathan Yau)

Americans Need A Raise

But Dean Baker doubts it’s in the cards for the majority of them:

[M]ost workers are unlikely to see wage growth until the labor market has far less unemployment than at present. If the economy continues to add 240,000 jobs a month, we may be at this point somewhere in 2016, but we aren’t there now and we will almost certainly not be there any time in 2015. While the unemployment rate has fallen most of the way back to its pre-recession level, this is largely because millions of unemployed workers have dropped out of the labor force and are no longer counted as unemployed. Contrary to what is often claimed, this is not a story of aging baby boomers retiring.

Millions of prime age workers (ages 25-54) did not just suddenly decide to retire. They left the labor force because of weak labor demand. The number of people involuntarily working part-time is still 2 million above its pre-recession level. Furthermore, the share of unemployment due to people voluntarily quitting their previous jobs, a measure of confidence in the strength of the labor market, remains well below its pre-recession level.

[T]he National Federation of Independent Business Small Business Optimism survey, which just reached its highest level since October 2006, shows that the shares of small businesses reporting that they either recently raised wages or plan to in the next three months are at new post-recession highs … These numbers seem a bit hard to square with data from last week’s jobs report showing that wages have been flat for so long. Plans for future hiring in the NFIB index are also up, as are the number of job openings in another Labor Department survey released today; both suggest that upward pressure on wages should be rising, particularly given the shrinking number of available idle workers out there.

Ben Casselman admits that it “isn’t clear why wages aren’t rising faster”:

One possibility is that the economy just hasn’t improved enough yet. Maybe if the workers-to-jobs ratio improves a bit more, employers will finally have to start offering raises. Or perhaps the ratio exaggerates improvements in the labor market because there are lots of workers who want jobs but don’t officially count as unemployed. (That can’t be the whole story, though. As I noted back in August, the workers-per-opening measure is improving even under less strict definitions of unemployment.)

The other possibility is that there are deeper, so-called structural reasons for the slow growth in wages, an issue I discussed in a bit more depth at the start of the year. Economists point to a range of possible explanations: globalization, technological change, declining unions, rising regulation. What they all share is that they won’t be solved through economic growth alone.

A Western Blind Spot? Ctd

A reader responds to the question of Western coverage of Boko Haram compared to Charlie Hebdo:

A blind spot? Not at all. Violence in places with strong traditions of law and order is news.  Violence in places where there is less of such a tradition is not. Francis Fukuyama in his seminal two-volume work reveals to us how hundreds of years of effort under favorable geographic and cultural circumstances are required even to hope for such traditions. There is nothing racist or materialist about the lack of coverage.  It is nothing more than human beings refusing to regard what can plainly be anticipated as news.

The Dish, which is a fairly representative gauge of Western media coverage, has produced about 25 posts on Boko Haram overall. CNN had the terrorist group in its top 10 list of most talked about stories of 2014. Regarding the most recent massacre, we posted two roundups that included pieces from the AP, WaPo, Time, TNR, New Yorker, Reuters, Bloomberg, BBC, Guardian, etc. – hardly a blind spot of Western media, though others will argue that coverage wasn’t quick enough. Matt Schiavenza is on the same page as our reader:

The main difference between France and Nigeria isn’t that the public and the media care about one and not the other. It is, rather, that one country has an effective government and the other does not.

The French may not be too fond of President Francois Hollande – his approval ratings last November had plunged to 12 percent – but he responded to his country’s twin terror attacks with decisiveness. Not so Nigeria’s Goodluck Jonathan. Since assuming the presidency in 2010, Jonathan has done little to contain Boko Haram. The group emerged in 2002 and has consolidated control over an area larger than West Virginia. And it’s gaining ground. Perversely, the seemingly routine nature of Nigeria’s violence may have diminished the perception of its newsworthiness.

Along those lines, Charlotte Alter adds:

The reports coming out of Baga are still sketchy, and there’s not yet an official death toll because Boko Haram still controls the area. The details of the Charlie Hebdo attacks were immediately available, and were accompanied by compelling video that quickly dominated every major news network. … More importantly, the attack in Paris was largely unprecedented (Charlie Hebdo was firebombed in 2011, but nobody was hurt), while the massacre in Nigeria is part of a long string of Boko Haram attacks that some are even calling a “war”: the group killed over 10,000 people last year, according to the Council on Foreign Relations, and 1.5 million have fled their homes since the insurgency started. Plus, the fact that the Charlie Hebdo attack was a dramatic ambush of journalists may have added a layer of panic to the media coverage.

The latest on those “sketchy” reports:

Now, new satellite imagery obtained and released by Amnesty International and Human Rights Watch (HRW) appears to confirm that considerable damage occurred in the towns of Baga and Doro Gowon in Borno state.


The scale of the damage is remarkable. Adotei Akwei, managing director of government relations at Amnesty International USA, says that the images and other evidence suggested that the death toll was “certainly 700, if not 2,000 or close to it.”

Meanwhile, a few more readers add personal perspective to both the coverage and coverage of the coverage:

In 1984, I was with a group of Canadian volunteers who came to teach English in Nigeria. We spent two weeks in Yola, then moved out to our postings in the state. Within two weeks after our departure, the central part of Yola (Jimeta) was in flames. There had been a conflict between police and a heretical Muslim sect called the Maitatsine. The crisis went on for days, and was only resolved, as we were told, after the army bombarded the central part of the city. The report was that 800 people had been killed.

Here’s the relevant part with respect to your story on Boko Haram: when the Canadian organisation contacted our families to reassure them that we were all right, it turned out that they didn’t know about the violence at all. 800 people (apparently) had died in Nigeria, and the Canadian media ran nothing about it.

Media silence on Boko Haram has only become an issue now because Western media were running stories on it in the first place, and then they died down. In 1984, the situation was worse and got no attention.

Another writes, “Try putting your son on a plane back to Nigeria”:

Thank you for bringing attention to Boko Haram, and specifically linking to Hilary Matfess’s piece. She’s got it exactly right. Boko Haram is not ISIS or Al Qaida fighting for a caliphate, exactly. Rather, it’s more like the rebelling army in a civil war. And, if you ask an average Nigerian, it’s not so black and white as our media reports as to who the bad guys are.

Boko Haram does this crazy shit because they’re nuts, don’t get me wrong. But Nigeria is the most corrupt country on the planet. Goodluck Jonathan sits on the top of the money pile, denying everything and doing nothing. The north has been marginalized politically and economically for way too long. Anyone who has been marginalized – going without clean water while politicians buy houses and art in Malibu – can tell you that when you have nothing, you have nothing to lose.

On a similar note, my son’s students are the offspring of oil executives and corrupt politicians (they freely say that’s their parents’ job title), as well as Pakistanis who have survived US drone strikes on their villages. A mix of students from such backgrounds gives you perspective. When he teaches Speech, for example, students are required to research, deconstruct, and present an important speech from someone they admire. You would expect the choices of most of the students, but some of them choose speeches by Osama bin Laden. When your best friend or family was killed in a US drone strike, you don’t expect them to pick the Gettysburg Address.

The Effect Of Obamacare

It’s starting to show up:

Every two years, the Commonwealth Fund surveys Americans on how difficult it is to afford medical care. The 2014 survey showed something new: for the first time in a decade, the number of Americans who say they can afford the health care they need went up.

John Tozzi provides necessary context:

A bunch of surveys and analyses have shown that insurance coverage is increasing under the Affordable Care Act (ACA). The new data are important because they show that financial barriers to care are coming down as well—even though many Obamacare plans come with high deductibles and other cost-sharing that makes patients shoulder greater costs.

Tara Culp-Ressler doesn’t overlook those who still can’t afford treatment:

The high cost of health care remains an issue for millions of Americans; according to Commonwealth, there are still about 66 million adults who reported skipping out on care last year because they couldn’t afford it. And previous studies from the organization have documented a trend in employers pushing more health costs onto their workers, leaving some Americans struggling to pay their deductibles and co-pays. Medical debt is one of the leading causes of bankruptcy in the United States. Still, the new report provides significant evidence that the Affordable Care Act is taking steps to tackle the problem.

Peter Blair is skeptical of Commonwealth’s findings:

[A]ccording to the Commonwealth Fund, the growth of high deductible plans could reverse some of these trends and access problems still remain acute for lower-income Americans. And even those may not be caveats enough, given that Gallup’s study of the access question reached the opposite conclusion. According to that poll, “Despite a drop in the uninsured rate, a slightly higher percentage of Americans than in previous years report having put off medical treatment, suggesting that the Affordable Care Act has not immediately affected this measure.”

Lastly, in not-so-good news for Obamacare, Ben Casselman calculates that the law has reduced some workers’ hours:

Taken together, the evidence suggests that the health law has likely led a few hundred thousand workers to see their hours cut or capped. That’s small in the context of an economy with 150 million workers. But it isn’t a minor issue for those workers. Most of them are among the economy’s most vulnerable: low-wage, part-time workers who likely have few other options.

But much of the increase in part-time work is voluntary:

The Congressional Budget Office, in a report before the law had fully taken effect, estimated that millions of Americans would voluntarily shift to part-time status once they no longer needed to work full-time to qualify for health benefits. It’s hard to know exactly how many have already done so. But there has been a marked increase in recent months in the number of people reporting they are working part-time by choice. It’s possible, as liberal economist Dean Baker has suggested, that this rise reflects the effect of the health law.