Shyam's Slide Share Presentations


This article/post is from a third party website. The views expressed are those of the author. We at Capacity Building & Development may not necessarily subscribe to it completely. The relevance & applicability of the content is limited to certain geographic zones. It is not universal.


Saturday, October 31, 2015

Enormous, ‘catastrophic’ landslide in northern Canada almost went undetected by humans 10-31

It was enough rock and ice to equal the weight of 33 million pickup trucks. And for an earth-shattering two minutes, it barrelled down a mountain at nearly 200 km/h, pulverizing everything in its path.

And the Yukon didn’t even know it was there.

“Most likely it would have been missed,” said Colin Stark, one of two Columbia University geologists who spotted the “catastrophic” event from more than 5,000 kilometres away.

Expected to rank among the world’s 10 largest landslides for 2015, the slide saw a massive chunk of rock and ice slough off Mount Steele, Canada’s fifth-highest peak.

If the Oct. 11 slide had struck in a populated region of India or China, hundreds would be dead. Even in Canada’s sparsely populated Rocky Mountains, it would have at least blocked a highway or buried a wilderness lodge.

But it occurred in a remote corner of Kluane National Park, Canada’s Wales-sized contribution to a swath of protected land stretching all the way to Anchorage, Alaska.


“That particular area, as the crow flies, is about 70 kilometres to the Alaska Highway; people aren’t back there very often,” said Jeff Bond with the Yukon Geological Survey.

Quite often, said Bond, the only way remote Yukon landslides are recorded is if some bystander happens to phone it in.

“We hear about them from helicopter pilots who say, ‘hey, did you guys know about this big slide?’ ”
In the summer of 2014, for instance, a landslide plowed into a creek within Kluane National Park, violently creating Canada’s newest lake. But it wasn’t until six months later that the change happened to be spotted by an off-duty parks employee who was out for a hike.

Credit for discovering the Oct. 11 avalanche falls to Stark and his research partner, Göran Ekström.
Operating out of Columbia University’s Department of Earth and Environmental Sciences, they have pioneered a way of discovering landslides by sifting through data typically used to detect earthquakes.

In the case of Mount Steele, one of the few witnesses to more than 45 megatons of falling rock and ice was “YUK8,” an unmanned Canadian seismic monitoring station located only 25 kilometres away.

Confirmation then came via Landsat 8, a NASA Earth-monitoring satellite that focused on the remote mountain and noted that vast areas of the surrounding peaks and glaciers had been blackened by debris.


“The fact that the debris fell on ice contributed to the long debris field,” NASA wrote in a subsequent post.

Named for Sam Steele, the Yukon’s indomitable top Mountie during the Klondike Gold Rush, Mount Steele has proved to be surprisingly delicate of late.

In 2007, the mountain first found its way into geology textbooks with a slide that buried an area larger than Vancouver’s Stanley Park, and shook the ground with the force of a magnitude-5.2 earthquake.
That time, the only witnesses were a research team led by glaciologist Garry Clarke, who just happened to be nearby when they heard the distinctive roar of a mountain falling over.
“They got dusted by the ice avalanche, actually,” Bond said.

Mount Steele is part of the Saint Elias Mountains, which include Canada’s highest peak, Mount Logan. Amid regular earthquakes and landslides, the range is also one of Canada’s most geologically interesting corners.

Just last year, in fact, a mountain face broke off nearby Mount La Perouse, in Alaska, creating one of North America’s largest known natural landslides.

View at the original source

Sunday, October 11, 2015

Do today’s philanthropists hurt more than they help? 10-11


As philanthropy enters a second golden age, real social change is getting lost in the hype of market-based giving.

In one of his short stories, “Counterfeit Money,” Charles Baudelaire describes a fictional encounter between two friends who come across a beggar in the street.
In this short piece, the narrator and friend offer the beggar spare change, but the friend offers a much larger coin. The narrator commends him for his generosity. The friend accepts the compliment and then adds, once they’re out of earshot of the beggar, “It was a counterfeit coin.”
The narrator is astounded. Not only has his friend duped the beggar on purpose, what’s worse, he feels self-congratulatory for his gift. His satisfaction lies in the fact that the beggar doesn’t realize that he has been duped. The narrator sees that his friend’s “aim had been to do a good deed while at the same time making a good deal; to earn forty cents and the heart of God; to win paradise economically; in short to pick up gratis the certificate of a charitable man.”
Baudelaire’s story was written in the latter half of the 19th century, a time when industrialists such as Andrew Carnegie and John D. Rockefeller Sr. began channelling their vast fortunes into some of the greatest acts of philanthropy ever known. From Carnegie’s spending on public libraries to Rockefeller’s investment in biomedical advances, their giving helped to shift charity from the dispensing of alms in a largely unsystematic manner to a business in itself, overseen by paid philanthropic advisors.

Many did not feel grateful for the robber barons’ generosity, however. In his essay, “The Soul of Man Under Socialism,” Oscar Wilde berated the tendency of benefactors to use their charity as a bulwark against redistributive demands.
“The best among the poor,” Wilde wrote, “are never grateful. They are ungrateful, discontented, disobedient, and rebellious. They are quite right to be so … Why should they be grateful for the crumbs that fall from the rich man’s table? They should be seated at the board, and are beginning to know it.”
As philanthropy enters a second golden period, with the gifts from benefactors such as Bill Gates and Warren Buffett rivaling those offered toward the end of the gilded age, sceptics are starting to ask: Are Wilde’s and Baudelaire’s concerns still relevant? Are today’s philanthropists knowingly dispensing “false coins?” Are they trying to “pick up gratis the certificate of a charitable man?”
In most cases, the answer is a firm no. Charity is dispensed in good faith, with empathy toward close and distant strangers. And yet, at the same time, a new trend is growing: philanthrocapitalism, a more muscular philanthropy that seeks to combine profits with poverty alleviation. The effort to do a good deed while at the same time making a good deal is the driving impetus behind the new philanthropy. Another question Baudelaire raised still lingers: Who benefits more from charitable acts, the giver or the receiver?
At the forefront of the new philanthropy is the effective altruism movement, something upheld as radically different from earlier philanthropic approaches through a purportedly novel emphasis on measuring the results of giving. A pioneer in the movement is Peter Singer, the controversial bioethicist who has praised Buffett and Gates for being “the most effective altruists in history.”
His praise rests on the magnitude of their giving rather than evidence of their effectiveness. It’s true that in dollar terms, their generosity is jaw-dropping. Joel Fleishman points out in The Foundation that Buffett’s 2006 announcement of a gift of $31 billion to the Gates Foundation represented, in 2006 dollars, more than Rockefeller Sr. and Carnegie gave away combined.
But as a proportion of the overall U.S. Gross Domestic Product, the size of today’s foundations pales next to their predecessors. The Ford Foundation’s endowment in the early 1960s represented more than double the share of U.S. GDP in comparison to the Gates Foundation 50 years later. Ever since the 1970s, overall charitable giving in the U.S. “as a share of GDP has rarely strayed far from 2 percent,” Suzanne Perry points out in the Chronicle of Philanthropy, “despite the huge growth in the number of charities and fundraisers and periodic crusades to encourage greater giving.”

Corporations have become far stingier. Mark Kramer and Michael Porter pointed out in the early 2000s that corporate philanthropy as a proportion of corporate profits had dropped since the 1980s. Since then it’s sunk even further, from 2.1% of pretax profits in the mid-1980s to 0.8% in 2013.
Singer’s presumption that Buffett and Gates are any more effective than earlier philanthropists isn’t backed by data. Some of the Gates Foundation’s work has led to measurable gains. Vaccination rates are rising; global child mortality has fallen—the foundation’s work in global health has contributed to these gains. But in comparison to government donors, Gates Foundation grants are a small drop in the global health landscape: The U.S. government has committed over $65 billion to global HIV/AIDs programs alone. That’s double the amount of overall giving by the Gates Foundation toward U.S. education, global health, and global agriculture since its inception.
To date, there has been far more hype than hard evidence about effective altruism’s achievements; its progress often seems to be measured and underpinned by self-sustaining feedback loops. Donors privilege what critics see as low-hanging fruit: aid projects where measuring the effect is relatively easy to do.
We hear a lot about the positive effects of different programs, such as the benefits of deworming efforts worldwide that were once thought to have contributed dramatically to education attainment in developing nations, until a recent review from independent health research group Cochrane cast doubt on that link. Far less attention is paid to counterfactuals, such as the cost to welfare programs when tax revenue is lost as a result of philanthropists receiving lucrative tax exemptions for pet projects.
Today’s philanthropy enthusiasts are never short on hyperbole. An organizer of a recent effective altruism conference at Google’s Quad campus in Mountain View reportedly averred that “effective altruism could be the last social movement we ever need.” But it’s clear that rises in global giving over the past 10 years have not made a dent in reducing economic inequality in rich nations such as the United States or Britain.
Individual philanthropic foundations have grown at a fast clip in the U.S. over the past 15 years: In that span, the number of individual foundations has doubled from about 40,000 to over 85,000. But this surge hasn’t helped alleviate extreme poverty. A 2012 report from the National Poverty Center at the University of Michigan points out that within the U.S., “the prevalence of extreme poverty rose sharply between 1996 and 2011.”

One of the biggest ironies facing 19th-century philanthropy was the question of whether growing charity simply exacerbated economic inequality by thwarting demands for better wages and the right to unionize.
Carnegie published his first “Wealth” essay, in which he urged the rich to share their spoils, just a few years before the Homestead battle of 1892, one of the bloodiest labor standoffs in U.S. history, where he brutally stamped out burgeoning union efforts even while liberally dispensing charity to his workers. “Paradoxically,” David Nasaw, Carnegie’s biographer, has pointed out, “Carnegie … became, if anything, more ruthless in pursuit of profits once he had determined that those profits would be distributed during his lifetime.”
“In remonstrating that only the millionaire could be trusted to dispense his millions, and that whatever that millionaire thought ‘best’ was best,” Nasaw adds, “Carnegie was promulgating a profoundly antidemocratic gospel, almost feudal in its paternalism.”
Effective altruists insist that private charity is the best means for improving livelihoods. “Today’s philanthrocapitalists see a world full of big problems that they, and perhaps only they, can and must put right,” Matthew Bishop and Michael Green write in Philanthrocapitalism: How Giving Can Save the World, a book that’s become something of a bible for the new philanthropists.
In contrast to claims of novelty, the results-oriented approach of today’s donors is little different than Carnegie or Rockefeller, who were both outspoken about the need to give away their money in an efficient and effective manner.
And just as in Carnegie’s day, philanthropy is often upheld as justification for gross profiteering.
“I donated a total of $5,000,000 to various causes recently. Looking forward to telling you all about it,” Martin Shkreli, the CEO of Turing Pharmaceuticals who was vilified for raising the price of Daraprim by 5,000%, tweeted in mid-September.
This is a prime example of philanthrocapitalism in action: the use of philanthropy to thwart attention to business practices that hamper access to life-saving medicines. And much like in Carnegie’s time, many aren’t buying it.

Friday, October 2, 2015

3D Computer Chips Could Be 1,000 Times Faster Than Existing Ones 10-02



A new method of designing and building computer chips could lead to blisteringly quick processing at least 1,000 times faster than the best existing chips are capable of, researchers say.

The new method, which relies on materials called carbon nanotubes, allows scientists to build the chip in three dimensions.

The 3D design enables scientists to interweave memory, which stores data, and the number-crunching processors in the same tiny space, said Max Shulaker, one of the designers of the chip, and a doctoral candidate in electrical engineering at Stanford University in California.

Reducing the distance between the two elements can dramatically reduce the time computers take to do their work, Shulaker said Sept. 10 at the "Wait, What?" technology forum hosted by the Defense Advanced Research Projects Agency, the research wing of the U.S. military.

Progress slowing

The inexorable advance in computing power over the past 50 years is largely thanks to the ability to make increasingly smaller silicon transistors, the three-pronged electrical switches that do the logical operations for computers.

According to Moore's law, a rough rule first articulated by semiconductor researcher Gordon E. Moore in 1965, the number of transistors on a given silicon chip would roughly double every two years. True to his predictions, transistors have gotten ever tinier, with the teensiest portions measuring just 5 nanometers, and the smallest functional ones having features just 7 nanometers in size. (For comparison, an average strand of human hair is about 100,000 nanometers wide.)
The decrease in size, however, means that the quantum effects of particles at that scale could disrupt their functioning. Therefore, it's likely that Moore's law will be coming to an end within the next 10 years, experts say. Beyond that, shrinking transistors to the bitter end may not do much to make computers faster.
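The two-year doubling rule described above is just exponential growth, and can be sketched in a few lines (the starting count and year values below are arbitrary placeholders for illustration, not historical figures):

```python
# Moore's law as a simple exponential: counts double every `period` years.
def transistors(start_count: float, start_year: int, year: int,
                period: float = 2.0) -> float:
    """Project a transistor count forward under a fixed doubling period."""
    return start_count * 2 ** ((year - start_year) / period)

# Ten doublings over 20 years multiply the count by 2**10 = 1024.
print(transistors(1000, 0, 20))  # 1024000.0
```

By the same arithmetic, if the doubling stops within a decade, as the experts quoted here expect, the curve flattens after only a handful of further doublings.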

Long commute time

The main roadblock to faster computers is not flagging processor speed, but a memory problem, Shulaker said.

Big-data analysis requires the computer to draw some tiny piece of data from some previously unknown spot in truly staggering troves of data. Then, the computer must shuttle that information via an electrical signal back and forth across the (relatively) vast inches of wire between the computer's memory (typically a hard drive) and the processors, facing the speed bump of electrical resistance along the entire path.

"If you try to run that in your computer, you would spend over 96 percent of the time just being idle, doing absolutely nothing," Shulaker said. "You're wasting an enormous amount of power." While the Central Processing Unit (CPU) waits for a piece of data to make the return trip from the memory, for instance, the computer is still hogging power, even though it's not calculating a thing.
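The 96 percent idle figure is consistent with a simple latency-bound model: if every unit of compute must wait for one memory round trip, the idle fraction is the round-trip time divided by the total time. A toy calculation (the nanosecond values are invented round numbers chosen only to reproduce the quoted figure, not measurements from the talk):

```python
def idle_fraction(compute_ns: float, memory_roundtrip_ns: float) -> float:
    """Fraction of wall-clock time spent waiting on memory when each
    unit of compute must wait for one memory round trip."""
    return memory_roundtrip_ns / (compute_ns + memory_roundtrip_ns)

# With 4 ns of useful compute per 96 ns memory round trip,
# the processor sits idle 96% of the time.
print(idle_fraction(4.0, 96.0))  # 0.96
```

Shrinking the round-trip term is exactly what stacking memory directly on top of the processor, as described below, is meant to do.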
Solving the memory-CPU "commute time," however, is tricky. The two components can't be put in the same wafer because silicon-based wafers must be heated to about 1,800 degrees Fahrenheit (1,000 degrees Celsius), while many of the metal elements in hard drives (or solid state drives) melt at those temperatures, Shulaker said.

Carbon nanotubes

To get around this issue, Shulaker and his advisers at Stanford University, Subhasish Mitra and H.-S. Philip Wong, looked to a completely different material: carbon nanotubes, or minuscule mesh rods made of carbon atoms, which can be processed at low temperatures. Carbon nanotubes (CNTs) have electrical properties similar to those of conventional silicon transistors.

In a head-to-head competition between a silicon transistor and a CNT transistor, "hands down, the CNT would win," Shulaker told Live Science. "It would be a better transistor; it can go faster; it uses less energy."

However, carbon nanotubes grow in a disorderly manner, "resembling a bowl of spaghetti," which is no good for making circuits, Shulaker said. As such, the researchers developed a method to grow nanotubes in narrow grooves, guiding the nanotubes into alignment.
But there was another hurdle. While 99.5 percent of the nanotubes become aligned, a few stragglers will still be out of position. To solve this problem, the researchers figured out that drilling holes at certain spots within the chip can ensure that even a chip with wayward tubes would work as expected. 

Another problem is that while most CNTs have the properties of a semiconductor (like silicon), a few act just like an ordinary conducting metal, with no way to predict which tubes will misbehave. Those few conducting tubes can ruin an entire chip, and having to toss even a fraction of the chips wouldn’t make financial sense, Shulaker added. As a remedy, Shulaker and his colleagues essentially “turn off” all the semiconducting CNTs and then drive huge jolts of current through the chip, so the current flows only through the remaining conducting nanotubes. The high current heats up and breaks down only the conducting nanotubes, which blow like nano-scale fuses, Shulaker said.

In 2013, the team built a CNT computer, which they described in the journal Nature. That computer, however, was slow and bulky, with relatively few transistors.

Now, they have created a system for stacking memory and transistor layers, with tiny wires connecting the two. The new 3D design has slashed the transit time between transistor and memory, and the resulting architecture can produce lightning-fast computing speeds up to 1,000 times faster than would otherwise be possible, Shulaker said. Using the new architecture, the team has built a variety of sensor wafers that can detect everything from infrared light to particular chemicals in the environment.

The next step is to scale the system further, to make even bigger, more complicated chips.

View at the original source