Archive for October, 2009

Using Cell Phones for Exams

October 28th, 2009

Op-ed submission to the Sydney Morning Herald
By Marc Prensky

I have watched with interest, in far-off America, the media storm raised by the use of mobile phones for exams at Presbyterian Ladies’ College (PLC), Croydon, based – at least in part – on my recommendations. I was even interviewed by phone by two Australian radio stations.

This high level of media interest – which places the idea of permitting twenty-first century devices in our teaching and evaluation squarely in the public conversation about how we educate our kids – is as it should be, and here is why:

Too many of today’s adults are of the opinion that their children’s education should remain exactly as it was when they were educated in a period before digital technology, the Internet, and other twenty-first century innovations. Unfortunately, this attitude, if implemented, prepares our children not for the future they will face in their lifetimes, but only for the past.

In particular, the idea of testing a person without all the tools they will have at their disposal in the real world is no longer appropriate. It is akin to asking a person to tell you the time, but not letting them consult their watch. Every plumber, doctor, or musician is tested with the tools and instruments of their trade. Can you imagine an examiner saying “OK, doctor, tell me about this patient’s heartbeat – but leave that stethoscope in your pocket!” Or imagine that doctor having a question about a diagnosis and not phoning a colleague?

The attitude that we should know as many facts as possible, and hold in our heads every trivial piece of information we might need to use in our lives was useful in a time when the body of knowledge was much smaller and information was much harder and slower to find. Memorizing phone numbers allowed you to dial faster. Memorizing the multiplication tables saved you the trouble of adding. Memorizing the names of places was helpful when maps were not always available.

But those were, in the words of one 10-year-old, the “olden days.” Today’s kids store numbers on their phones, use the calculators in their phones to multiply and divide, and, increasingly, tell the time from their phones as well. This frees their minds, ideally, to think of more important things than what is increasingly known as “trivia” – IF they are taught to do so, and IF they are evaluated on that ability, rather than on what they have memorized.

Understanding of key concepts is just as important today as in the past – perhaps even more so. But one can understand what a map of the world looks like and represents without being able to name every country and capital. How completely and accurately could you, the reader, fill in a blank map of Africa, or of the former Soviet Union, or of the former Yugoslavia, without help?

There is nothing shameful or “uneducated” about this, because this information, in the twenty-first century, is easily findable.

Given this, smart educators, like those at PLC, assume the availability of such facts via the students’ always-on devices, and on exams, ask harder questions, such as “What do these facts mean?” or “How do we interpret this information?”

And as for those who raise the scenario of technology breaking down, or of someone’s forgetting, or not owning the tools, I remind those people of what we all do whenever we leave our watch at home, or when its battery runs down: we just ask someone else for the time. It is not doing this that would really seem “uneducated” in the twenty-first century.

Marc Prensky is an internationally acclaimed thought leader, speaker, writer, consultant, and game designer in the critical areas of education and learning. He is the author of Digital Game-Based Learning (McGraw-Hill, 2001) and Don’t Bother Me, Mom, I’m Learning (Paragon House, 2006). Marc is the founder and CEO of Games2train, a game-based learning company, whose clients include IBM, Bank of America, Pfizer, the U.S. Department of Defense and the LA and Florida Virtual Schools. Marc holds an MBA from Harvard and a Masters in Teaching from Yale.

Categories: Development

Qualcomm to focus on low-cost computing, 3G market in India

October 28th, 2009

These chips power connectivity in mobile phones that are cheaper than $20 but can be used in a variety of other applications, such as connecting heart monitors to hospitals or car sensors to a traffic grid.

K. Raghu

Bangalore: Mobile-phone chip maker Qualcomm Inc. will take the low-cost chips it has designed for emerging nations like India back to the US, to use them in applications such as connecting electricity meters to power grids, a top executive said.

These chips power connectivity in mobile phones that are cheaper than $20 (about Rs1,000) but can be used in a variety of other applications, such as connecting heart monitors to hospitals or car sensors to a traffic grid.

“You are no longer talking about people (talking) to people,” Kanwalinder Singh, president of Qualcomm’s India unit, said late Wednesday on the sidelines of a conference.

Having produced affordable semiconductor chips and modules, the firm is able to take these to other markets for machine-to-machine applications, he added.

Cellular connections in the US that enable wireless data calls between machines will increase threefold by 2014 from the current 75 million, technology market research firm ABI Research said on 22 September.

Qualcomm’s India centres in Bangalore and Hyderabad, which employ at least 1,000 people, have played a major role in designing single-chip handsets, driving down costs of such phones to $20 from $80 five years ago, said Singh.

The firm dominates the so-called code division multiple access (CDMA) technology used by more than 100 million cellphone users in India. It has been supplying these chips to the US for a year now, Singh said, without giving specific numbers.

In India, Qualcomm is set to begin trials of a low-cost computer called Kayak by the end of this year. Kayak, named after the boats used by American Indians and known for their simplicity, is designed for high-speed broadband connectivity on third generation (3G) spectrum.

India is expected to auction 3G spectrum later this year, which will allow cellular operators to offer high-speed Internet access to content such as video on mobile phones.

“We have the reference design ready. We are talking to operators, application providers so as to ensure that the right applications work,” said Singh.

Kayak, which should cost less than Rs10,000, will have a monitor and a wireless modem, allowing users to access applications and software over the Internet and download some of them locally.

Qualcomm is working with the Azim Premji Foundation, the non-profit education initiative of Azim Premji, chairman of Wipro Ltd, India’s third largest software vendor, to host on the Internet the local language education material it has built for school students.

“Our aim is to demonstrate to policymakers that if you provide connectivity, you can reap benefits in the education sector,” said Sukumar Anikar, head of technology for education at Azim Premji Foundation. “It is not only students, but even teachers who will benefit.”

Qualcomm will license the technology to hardware vendors and earn royalties on chips sold once Kayak is commercialized by 2010, Singh said.

“We will test them in emerging countries because we are targeting under-penetrated markets,” he said. “Surprisingly, we are getting a lot of enquiries (for Kayak) even from developed markets.”

Categories: Development

Words from Bill Gates

October 28th, 2009

Love him or hate him, he sure hits the nail on the head with this!
Bill Gates recently gave a speech at a high school about 11 things students did not and will not learn in school.
He talks about how feel-good, politically correct teachings have created a generation of kids with no concept of reality, and how this sets them up for failure in the real world.

Rule 1: Life is not fair – get used to it!

Rule 2: The world won’t care about your self-esteem. The world will expect you to accomplish something BEFORE you feel good about yourself.

Rule 3: You will NOT make $60,000 a year right out of high school. You won’t be a vice-president with a car phone until you earn both.

Rule 4: If you think your teacher is tough, wait till you get a boss.

Rule 5: Flipping burgers is not beneath your dignity. Your grandparents had a different word for burger flipping: they called it opportunity.

Rule 6: If you mess up, it’s not your parents’ fault, so don’t whine about your mistakes, learn from them.

Rule 7: Before you were born, your parents weren’t as boring as they are now. They got that way from paying your bills, cleaning your clothes and listening to you talk about how cool you thought you were. So before you save the rain forest from the parasites of your parents’ generation, try delousing the closet in your own room.

Rule 8: Your school may have done away with winners and losers, but life HAS NOT. In some schools, they have abolished failing grades and they’ll give you as MANY TIMES as you want to get the right answer. This doesn’t bear the slightest resemblance to ANYTHING in real life.

Rule 9: Life is not divided into semesters. You don’t get summers off and very few employers are interested in helping you FIND YOURSELF. Do that on your own time.

Rule 10: Television is NOT real life. In real life people actually have to leave the coffee shop and go to jobs.

Rule 11: Be nice to nerds. Chances are you’ll end up working for one.

If you agree, pass it on.
If you can read this – Thank a teacher!

Categories: Motivational

A World Wide Woe

October 27th, 2009

Internet addiction sounds like a punch line. But it ruined my brother’s life.

By Winston Ross | Newsweek Web Exclusive

Oct 8, 2009

Last Friday I walked into the most recent inpatient Internet addiction treatment center to open in the United States and asked a really dumb question. “Do you have Wi-Fi here?” I bumbled, prompting an awkward smile from the man who opened the door at the Fall City, Wash.-based ReSTART Internet Addiction Recovery Program. It was the equivalent of walking into an Alcoholics Anonymous meeting and asking for a single-malt Scotch.

It was also revealing. I hadn’t checked my e-mail, Facebook, or Twitter accounts for nearly 14 hours by the time I showed up at the wooded five-acre retreat, situated with some irony less than 15 miles from Microsoft Corp.’s Redmond headquarters. That drought had begun to eat away at me enough that by the time I walked through the door I was so fixated on plugging back in that my brain was able to push past the blatant insensitivity it took to ask such a question.

Most of my friends smirked when I told them I was heading up to Washington to write a story about the newly opened center, which sits on a wooded parcel of property adorned with a 3,500-square-foot craftsman house, Western red cedar treehouses, chicken coops, and goat pens. We all kid about being hooked on Facebook, but it doesn’t really seem like the kind of thing anybody would need to drop $14,000 (the cost of a 45-day stay at ReSTART) on to quit cold turkey. The fact is, though, I have believed for some time now that Internet addiction is a very real phenomenon. And not just because I’ve read stories about the well-established and at-capacity treatment centers in China and South Korea, or because I know antisocial kids who routinely put in 14-hour shifts playing World of Warcraft. Internet addiction is the reason my 36-year-old brother has been homeless for most of his adult life.

I hadn’t really understood this until recently, because having a homeless brother always terrified me too much to make any real effort to understand why Andrew could never get his life together. A couple of years ago I decided I’d protected myself from this depressing truth long enough. I contacted my brother and said I wanted to spend a day with him, from the moment he awoke to the time he went to sleep, to see what his life was like. I approached the trip with a journalist’s curiosity and method—a pen and steno pad—but it was obviously going to be a personal expedition.

Andrew, who is four years older than I am, sleeps in a roomy tent, atop three mattresses he’s acquired from one place or another, between a set of railroad tracks and Oregon State Highway 99, in a clearing ringed by blackberry bushes. He lives most days the same way. He gets up when he feels like it, walks to the local Grocery Outlet, and uses food stamps to buy a microwaveable meal. Then he treks over to the local soup kitchen and enjoys a free lunch, answering the greetings of his other homeless pals, who speak highly to me of the obese, bearded man they call “Ace.”

When the rest of his buddies head off to the park to suck down malt liquor or puff weed, Andrew eyes a different fix at the Oregon State University computer lab, which is open to the public. He’ll spend the next 10 hours or so there, eyes focused on a computer screen, pausing only to heat up that microwaved meal. He plays role-playing videogames such as World of Warcraft, but he’s also got a page of RSS feeds that makes my head spin, filled with blogs he’s interested in, news Web sites, and other tentacles into cyberspace. He goes “home” only when the lab closes. He’s recently acquired a laptop, after much fundraising from sympathetic relatives, so he can now stay connected day and night, if he can find an open Wi-Fi hot spot.

Through the day I peppered him with questions, all meant to answer this one: why had he failed to make something of himself, and I hadn’t? It was a complicated question, but it was pretty clear by the day’s end that the most detrimental influence in his life, from an early age, was videogames and the Internet. We were both exposed to computers early on, but he had let them consume his identity.

Andrew was a child of Commodore 64s and online bulletin boards, and he was fascinated with them as far back as I can remember. With his early knowledge of computers and programming, I like to think he could have become an heir to Bill Gates, but he spent most of his time online goofing off, not developing software. Though my brother has never been officially diagnosed as an Internet addict, he readily admits that he demonstrates all of the signs and symptoms of the compulsion. His was a world of constant refreshing, immediate access to new information and stimuli. Before long, the real world couldn’t hold his attention span. He dropped out of high school and spiraled down a path that eventually led him to homelessness.

The Internet is addicting, says psychologist David Greenfield, founder of the Center for Internet and Technology Addiction in West Hartford, Conn., because it works on a “variable ratio reinforcement schedule,” which is a fancy way of saying that it gets you high every once in a while. This is based on a theory first espoused by renowned psychologist B. F. Skinner: not knowing whether a reward is coming is actually more compelling than being able to count on results every time.

“It can be as simple as finding an e-mail you like, hearing from somebody you love, being told a cousin is coming to visit, interspersed among a lot of neutral, less-salient information,” Greenfield says. “You don’t know how desirable that will be or when you’re going to get it.”
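The contrast Greenfield describes can be sketched in a few lines of purely illustrative Python. The function names, the ratio of 5, and the probability of 0.2 are invented for the example (they are not taken from Skinner's or Greenfield's work); the point is only that the two schedules pay out at the same average rate, yet one is fully predictable and the other is not:

```python
import random

random.seed(42)  # make the illustrative run repeatable

def fixed_ratio(n_checks, ratio=5):
    # Fixed-ratio schedule: a reward arrives on every `ratio`-th check,
    # so you can count on exactly when the payoff comes.
    return [1 if (i + 1) % ratio == 0 else 0 for i in range(n_checks)]

def variable_ratio(n_checks, p=0.2):
    # Variable-ratio schedule: each check pays off with probability p.
    # Same average rate as above, but you never know which "refresh"
    # of the inbox will be the rewarding one.
    return [1 if random.random() < p else 0 for _ in range(n_checks)]

fr = fixed_ratio(100)
vr = variable_ratio(100)
print("fixed:   ", sum(fr), "rewards")
print("variable:", sum(vr), "rewards")
```

Run over 100 checks, both schedules deliver roughly 20 rewards, but only the variable one leaves you guessing; that unpredictability is what Skinner found to be the more compelling reinforcement.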

Greenfield’s survey of 18,000 Web users, conducted in 1999, found that 5.9 percent of them demonstrate the symptoms of being addicted to the Internet. Since that survey, there hasn’t been much comprehensive research on the topic, says Kimberly Young, founder and director of the outpatient Center for Internet Addiction Recovery, because there aren’t enough treatment centers from which to acquire comprehensive data.

That’s partly because there remains some skepticism about whether Internet addiction qualifies as a real condition. Greenfield says he’s spent plenty of time trying to convince colleagues that Internet addiction is genuine, and Seattle psychotherapist Hilarie Cash, one of ReSTART’s cofounders, says she often hears from therapists who suggest that the issue isn’t the Internet but whatever anxiety or depression compulsive users are suffering that may lead them to overindulgence. Still, as Cash notes, both China and South Korea have declared Internet addiction their countries’ No. 1 public-health threat.

What is it about this modern invention that crosses the line from entertainment and simple utility to an addiction that can cost people their jobs, their shelter, and even their health? I don’t remember being hooked on Monopoly or checkers, but I do feel something gnawing at me when I’ve let my e-mail inbox collect messages for too long. I spent many a long day in college playing first-person shooter games until my thumbs were sore. For real addicts, there are even more serious medical issues at stake: there have been at least 10 documented cases of people dying from blood clots caused by sitting in front of a computer for too long, Cash says. Cosette Dawna Rae, who cofounded ReSTART with Cash, got a call recently from a woman whose Internet-obsessed son had to have his leg amputated after a clot.

The Web may be dangerous for some people because it can feed or spur existing addictions, making gambling, shopping, and sex readily available to those who have already developed the compulsion to binge on those things. And now that going online is a part of everyday life, it may not be easy to escape the temptation. You can drive past a bar, but it’s hard for many of us to keep a Web browser closed for more than a week at a time. “It’s the delivery system,” Cash says. “It’s available, acceptable, affordable, and accessible.”

The Internet also activates the same pleasure pathways in the brain as drugs and alcohol. As you continue to be rewarded, for completing the next level of a videogame or finding out a new piece of information, the connectors to the limbic system of the brain are stimulated, releasing euphoria-causing dopamine into the body. The brain remembers that happy feeling, encouraging you to keep going back for it.

ReSTART’s clients operate on a 12-step program model, but they also do delayed-gratification tasks that many of us take for granted: they cook, clean, feed animals, and build things. The best way to break from virtual reality, believe the center’s directors, is a healthy dose of actual reality. By checking back in to how normal people live their lives, clients (there have been only three so far since the center opened earlier this summer) can in theory wean themselves off the constant rush they once got from the Web. Otherwise, it’s daily psychotherapy sessions, to help them understand their addictions and the addictions’ underlying causes.

It’s a difficult problem to treat, says Jerald Block, a clinical psychiatrist at Oregon Health Sciences University who specializes in compulsive computer use. Among the three most common methods are antidepressants, treatment for attention deficit/hyperactivity disorder, and extended retreats from the computer. Cutting off access too suddenly or without other treatment worries Block, he says, because the computer has often become a container for aggression and a major relationship for an addict. Removing those can lead to some very aggressive behavior, including suicide or violence against others, he says: “You’re cutting the way they’re dealing with all of their emotions.”

My brother is outwardly content with his existence, rationalizing it by giving himself pep talks that at least he’s not slaving away for some evil corporation at a wage that’s beneath him. Andrew got his GED and tried several times to get a college degree, but anxiety and depression—two of the underlying symptoms that likely made computers and the Internet an appealing escape—kept him from being able to keep to a structure and conform. My mom had him tested for Asperger’s syndrome once and he showed symptoms but wasn’t diagnosable. He has been treated in the past for depression and anxiety, but says he never found the medications he was prescribed helpful. Is his obsession with the Internet an extension of these illnesses? It’s hard to say, but either way, the effect is real: once he finally gave up on college, he found it nearly impossible to find jobs that paid a livable wage.

But I know that homelessness, financially independent as it may be, will kill my brother someday. When the temperature dips below freezing in January, he’s still in that tent. It can’t be fun pulling those blankets off in the morning and wrestling on his clothes in the middle of an Oregon rainstorm. He’s had his camp burned to the ground by paranoid “neighbors.” Teenagers once beat him up, for no other reason than that he doesn’t live in an apartment. Just this month, somebody set fire to a homeless man in Eugene, and I wondered whether it was Andrew. Perhaps if my brother were to go to ReSTART, he could learn how to reconnect with the real world. Then again, he’s been consumed by computers for most of the past two decades. Maybe he’s a lost cause.

A big question for ReSTART clients is what happens when they leave. The Internet is a nearly inescapable temptation. Do they avoid jobs that involve computers? Vow never to look at a computer screen again? Rae and Cash have developed plans for clients who have graduated. They mostly advise establishing discrete sessions on computers—say, two hours a day—as a way to curb the dangers of the habit. But would you tell a heroin addict he can only smoke a gram a day? And could you keep your own browsing habits to under 120 minutes? “We work with each individual to develop a recovery plan,” Cash says, keeping in contact with them about how to set limits on their personal use of the Internet or, if necessary, avoid it except for work-related tasks.

People who have a job that puts them in front of a computer will obviously have a tougher time regulating Internet compulsion, Greenfield says. The most important step is to be conscious that the behavior’s dangerous; be aware that you may be getting high from it. Then, it’s about changing patterns of use to make sure you’re only using the Web when necessary, “not to medicate yourself because you’re bored, scared, tired, or angry,” he says. “You also have to work to fix the parts of your life that have atrophied due to the use of the Internet. I have cases where lives are essentially shut down because of it.”

Intense therapy may be required for an extended period of time, Cash says, and there is a piece of hardware available that shuts down the display on PCs once a user has surpassed a preset time limit. The adapter isn’t available for Apple computers yet.

As Cash talks to me at ReSTART, I wonder, as I have for much of my life, whether I could ever wind up like this. I spent hundreds of my childhood hours at an arcade, schooling adolescent competitors at Street Fighter II at the expense of my schoolwork, and during college I discovered a highly addictive site where I could while away my free time playing online spades. There were many 12-hour-long sessions before I finally worked up the nerve to walk away for good. I took a quiz on ReSTART’s Web site, and I don’t qualify as an addict, despite my frequent use of the Net, mostly because I don’t feel truly compelled to stay plugged in and there’s nothing detrimental about my time spent online.

I also wonder how many other people are addicted to the Internet without even knowing it. Research from Greenfield and others suggests that as much as 6 percent of the Internet-using population may have an addiction issue. The quiz is one good way to get an idea whether you have a problem. It’s based on the same methodology as other surveys to detect addiction. If you had to stop checking your e-mail for a week (let’s assume that you didn’t have to do so for work), would it bother you? “People are starting to self-examine,” Rae says. “Do I play too much? What would that look like?”

The next volume of the Diagnostic and Statistical Manual of Mental Disorders, the psychology bible, will likely have a new category for nonsubstance addictions, which has already kicked off a debate among psychiatrists about what should fall under that heading. Does bingeing on social networking qualify as an addiction? That such questions are being asked is a sign that Internet addiction, no longer a wink-and-nudge kind of subject, is being considered a more legitimate disorder in America. Inclusion in the DSM will hopefully lead to increased awareness of the problem and more options for treatment, says Cash. “There aren’t 12-step meetings readily available,” she says. “I’m looking forward to the day when those groups abound.”

So am I, because there’s nothing chuckleworthy about this condition. I did feel a slight rush after I left ReSTART and hopped back online, but I don’t think I’m to the point of needing treatment—yet. My brother, on the other hand, could have used some help a long time ago.

—Winston Ross is an Oregon newspaper reporter.


Categories: Health

Students go ‘wild’ with interactive lectures

October 27th, 2009

Students are being encouraged to go wireless in lectures, using the
University of Hull’s Wireless Interactive Lecture Demonstrator (WILD) to
interact with their subject.

Dr Darren Mundy and his colleagues at the University’s Scarborough School of
Arts and New Media are developing a range of cutting-edge software to
encourage students to be more involved during lectures – even choosing what
they want to learn about.

Using funds from JISC, Dr Mundy has developed WILD Thing, which allows
students to interact in real time while their lecturer delivers a lesson
through PowerPoint. This latest tool means students can annotate lecture
slides and answer questions or ask them, while the lecture is actually being
delivered, using their mobile phones or other wireless devices such as laptops.

David Flanders, JISC rapid innovation programme manager, explained: “What’s
really exciting about the JISC rapid innovation projects, including WILD
thing, is that the tools they are producing not only have the ability to
change the lives of teachers, researchers and students, but also their
potential use by people in business, government and even in the home.  These
tools are giving us glimpses of the future for how technology can continue
to enrich our lives.”

Dr Mundy and his colleagues have also developed a series of ‘Choose Your Own
Lecture’ presentations funded through the Higher Education Academy, allowing
students to select what they will learn during a teaching session.

Dr Mundy said: “We are trying to challenge the traditional methods of ‘chalk
and talk’ where a lecturer delivers a lecture and the students just listen
and take notes. These new methods mean the student becomes a ‘pro-sumer’ –
they are not only consuming information but producing it as well.

“We have already done a test run with our students and they really like
being able to interact as the lecture is actually being delivered. It also
offers anonymity within the lecture, so if someone is usually quite shy
about putting up their hand and asking a question in front of a lot of
people, now they can do so via the internet. We are always looking for ways
to involve those students who would never usually participate, so this
technology is really useful,” he added.

WILD Thing will be further tested in lectures this academic year and the
results published next summer, after which Dr Mundy hopes it will be rolled
out for use by lecturers anywhere in the country, as part of JISC’s rapid
innovation programme software catalogue.


Categories: Youth

The 8th Chinese Internet Research Conference

October 27th, 2009

Call for Papers
The 8th Chinese Internet Research Conference
School of Journalism and Communication, Peking University, China
June 29-30, 2010

Internet and Modernity with Chinese Characteristics: Institutions, Cultures and Social Formations

By June 30, 2009, the number of netizens in China had reached 338 million, surpassing the total population of the United States. Already the country with the largest number of Internet users since 2008, China now boasts 2.1 million websites and more than 100 million blogs. The fast-changing landscape of Internet usage in China has seen both quantitative and qualitative developments. In fact, the visions and thrills of getting online parallel China’s ambition to build a modern society with Chinese characteristics. The Internet has penetrated social institutions, political processes, cultural activities and people’s everyday life. It is time we looked beyond numbers and events and delved deeper into the fabric of China’s social life in order to understand how the Internet integrates, counteracts or cooperates with institutional, cultural and social forces in seeking and creating a modern form of existence. The theme of the 8th Chinese
Internet Research Conference, “Internet and Modernity with Chinese Characteristics: Institutions, Cultures and Social Formations,” is designed to bring together scholars, experts and leaders in the field to explore these fascinating developments and trends.

This will be the very first time this conference is held in mainland China. We aim to open a forum where different perspectives and expectations meet, communicate and interact, and where the agendas and hopes of the Chinese population are heard, discussed and analyzed on an international scale. The working languages of the conference will be English and Chinese, and we will provide translation services if necessary. The forms and contradictions in which China tries to conceptualize and materialize modernity, and how the Internet is helping in this process, are the main focus of this conference. Specific topics include but are not limited to:

Mediations between the global and the local
The political economy of the Internet and information industry in China
Internet as alternative media
Virtual communities and identity formation
Social relations on the Internet

With these topics, we hope to explore the following questions: How do global influences and local initiatives meet, interact and converge through and about the Internet? Is the Internet the agent of globalization and homogeneity, or that of local independence and particularity? With the increasing capitalization and commercialization of the Internet in China, what is the dynamic relationship among state, market and civil society? What creative uses of the Internet are made by Chinese netizens? Does the Internet serve as an alternative medium that meets the new challenges of a society in transformation? What are the different relationships that exist between the Internet and the so-called traditional media? How do people interact and form groups through the Internet and other information technologies? How do various virtual communities operate? What is the significance of virtual communities in the formation of identities and values for different social groups in China?
What power relations are maintained, challenged or undermined by the Internet? What institutional, social and cultural experiments are being conducted on the Internet that can enlighten our imagination of a just form of social organization?

Paper Submission
We welcome proposals of quantitative, qualitative and critical studies from all disciplines. Panel proposals are welcome too. Both English and Chinese proposals will be considered. We assume that the language of proposal will also be the language of presentation.
Preference will be given to papers and panels that significantly advance understanding of the role and impact of the Internet and associated technologies in China, including advancing theoretical understandings, methodological approaches, and sophistication of analysis.
A proposal of approximately 1,000 English words or Chinese characters is due by January 15, 2010, and should be sent to Dr. Wang Weijia and Dr. Wang Xiuli. Accepted papers will be announced on February 15, 2010. Completed papers should be submitted by April 24, 2010.

Categories: call for papers

Digital Discrimination: The New Racism?

October 27th, 2009 No comments

Euphoric promises of a cyberspace utopia and digital democracy notwithstanding, the Net is saturated with racial ideologies, says Rayvon Fouché.

by Christina Jeng | 02.27.2004 | ReadMe 4.3

“Ours is a world that is both everywhere and nowhere, but it is not where bodies live. We are creating a world that all may enter without privilege or prejudice accorded by race, economic power…” Blah, blah, blabbity, blah, blah.

So wrote John Perry Barlow, dubbed cyberspace’s Thomas Jefferson by Yahoo Internet Life Magazine, in his “A Declaration of the Independence of Cyberspace,” a 1996 manifesto circulated through e-mail and posted on thousands of sites. And while Rayvon Fouché, an assistant professor of science and technology studies at Rensselaer Polytechnic Institute, appreciates Barlow’s “neo-utopian” view, he contends that “it’s impossible to create any world devoid of the powerful social and cultural factors of race, gender, and class.”

To Fouché, the Net is a technology saturated with racial ideologies. In his recently published book, Black Inventors in the Age of Segregation: Granville T. Woods, Lewis H. Latimer, and Shelby J. Davidson, he examines the relationship between race and technology. He divides technology into three parts: the physical material it’s made of, the way it’s used, and the knowledge or ideas that drive its design and production. In Black Inventors, he looks at how three prominent black inventors struggled to contribute to the history of technological innovation during a period of escalating racial tensions. In Fouché’s opinion, the same factors that influenced who could invent during the early 1900s—race, class, and gender—still affect the field of engineering and other spheres of technological innovation today.
The Internet, says Fouché, is not as race- or class-free as cyber-utopians such as Barlow once thought it was. Sure, says Fouché, he can log onto the Net as a 50-year-old South Asian woman and no one would be the wiser. Or he could sign on as the African-American he really is, or morph into a single Latina mother, or how about a teenage valley girl with a crush on Josh HotNet? True, he has the freedom “to express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity…” (Barlow). The notion that digital media such as the weblog provide a publishing platform for the masses has inspired flights of Net pundit idealism.

Nonetheless, as Fouché points out, “You have to have access to a computer” to join Barlow’s online utopia, and race and class have proven to be barriers to getting wired. According to a 1999 U.S. Department of Commerce study, “Black households […] continue to trail white households in their access to computers and the Internet.” And in a 2003 Pew Internet and American Life report, researchers say “being white is a strong predictor of whether a person is online, controlling for all the other demographic variables” such as having a college degree, being a student, being employed, and having a comfortable household income. This is where Barlow’s manifesto and the promises of overly optimistic Net promoters like him start to break down.

That said, a 1999 Cyber Dialogue survey found that 4.9 million African-American adults were online, more than any other U.S. minority group. However, while that figure represents 28% of the black population in America, it’s still a smaller piece of the demographic pie than the 37% of adult whites who were online in ‘99.

Furthermore, the growth of the African-American population online has largely resulted, not in the creation of a color-blind utopia, but in the targeting of blacks as consumers.

Take America Online: According to Target Market News, AOL commissioned a national survey through Digital Marketing Services, Inc. (DMS) and found African-Americans to be “active online consumers, who respond more to online offerings and purchase more clothing and music online than the general online population.” Tapping into this consumer base, AOL recently purchased one of the largest online African-American communities. Target Market News reports that AOL plans to aggressively develop an African-American strategy that will involve its site as well.

Buyer Beware

Targeting African-Americans for consumer purposes isn’t just a Net thing. According to Fouché, it permeates technological culture. It’s a common misconception, he claims, that black people are best suited to consuming, and that they only use technologies, rather than creating them.

Worse yet, argues Fouché, since our technology is created by and for white males, blacks and other ethnic minorities are left to passively consume the products of this mindset, such as violent, hypermasculine videogames that offer players fantasies of domination and power—games like Half-Life, Medal of Honor, and Return to Castle Wolfenstein.

“It’s very infrequent that…corporations come into the black community saying ‘So, what are your needs?,’” says Fouché. “There [must] be more black engineers and designers that will say, ‘Well, what about my people?’”

The answer, for Fouché, is to get more African-Americans into technical institutes such as RPI, Massachusetts Institute of Technology, and California Institute of Technology, schools he hopes will produce the next, racially diverse generation of scientists and engineers.

“For the last decade, the black student population [at RPI] has been about 4 percent, never getting higher or any lower,” Fouché laments.

But why push African-Americans into fields and institutions he believes are built on a Western mindset fraught with racist assumptions?

“It’s impossible to extract oneself from the influences of Western culture,” says Fouché, “but to be in a position to make counter-hegemonic responses to your oppressive condition, you have to first deeply understand the system that oppresses you.” Then and only then, he asserts, can you “make technological decisions based on a set of priorities (racial, ethnic, cultural or otherwise).”

Racism? What Racism?

For the most part, says Fouché, “the black community doesn’t see technology as an ideological force affecting their lives.” They easily recognize racial representations like the minstrel show, he notes, but as we move into the digital realm, racial ideologies and even the loss of black culture are not so easily recognizable.

In an ongoing project that examines the shift from analog to digital technology, Fouché looks at the hip-hop art of “scratching,” a black cultural practice popular in New York City during the late 1970s and early 1980s. Turntables were originally an analog technology, “but once you go from analog turntables and vinyl [records] to digital turntables and CD-ROMs, the cultural practice is condensed into algorithms,” he says. Scratching on a digital turntable is based not on the artist’s gestures, notes Fouché, but on a software programmer’s representation of what scratching sounds like.

“That’s where it gets very scary,” he says. “This cultural practice that has a long tradition is reduced to lines of code, and by reducing it to lines of code, you lose the people and you lose the black culture.”

Fortunately, technology is redeemable. Recently, Fouché also co-edited, with cultural critic and RPI associate professor of science and technology studies Ron Eglash, Appropriating Technology: Vernacular Science and Social Power, an anthology of essays that examine the ways in which “outsiders,” such as Latinos, blacks, homosexuals, and women, reinvent consumer products, from low-rider cars to turntables to cell phones, and thereby “defy the notion that they are merely passive recipients of technological products.”
In his introduction, Eglash describes how Native American artist Sharol Graves, for example, reinvented CAD/CAM software, originally intended for computer circuit design, and used it for her Indian design drawings. Eglash quotes Graves as saying, “I wanted the public to know that a Native American was working in the research and development of high technology, just to blow a few stereotypes about the ‘Indian Mind.’” Appropriating technology, as Graves does, yields strategies for strengthening cultural identity, argues Eglash.

Says Fouché, “The revolution never comes, just a migration of power.”


Christina Jeng is a freelance journalist, “Tech News” editor of ReadMe, and contributing writer for The Washington Square News.

Categories: Uncategorized

The Development Gateway Foundation

October 27th, 2009 No comments

The Development Gateway Foundation provides Web-based tools to make aid and development efforts more effective. It offers innovative solutions that increase access to critical information, build local capacity, and bring partners together for positive change. The Development Gateway Foundation is a nonprofit organization with activities around the world.

Categories: Organisations

Information Economy Report 2009

October 26th, 2009 No comments

The Information Economy Report 2009: Trends and Outlook in Turbulent Times is the fourth in a series published by the United Nations Conference on Trade and Development (UNCTAD). The report is one of the few publications to monitor global trends in information and communication technologies (ICTs) as they affect developing countries. It serves as a valuable reference for policymakers in those nations. It gives special attention to the impact of the global financial crisis on ICTs.


Highlights include:

  • Global and regional trends in the diffusion of ICTs such as fixed and mobile telecommunications, the Internet, and broadband
  • A ranking of the most dynamic economies in terms of increased ICT connectivity between 2003 and 2008
  • Monitoring of the “digital divide”
  • A survey of national statistical offices on the use of ICT in the business sector
  • A review of changing patterns in the trade of ICT goods
  • A mapping of the new geography in the offshoring of IT and ICT-enabled services
  • Policy recommendations on how developing countries can reap greater benefits from ICT
  • A statistical annex with global ICT data

The Information Economy Report 2009 (IER 2009) offers a fresh assessment of the diffusion of key ICT applications between 2003 and 2008. While fixed telephone subscriptions are now in slight decline, mobile and Internet use continues to expand rapidly in most countries and regions. At the same time, there is a widening gap between high-income and low-income countries in broadband connectivity. Broadband penetration is now eight times higher in developed than in developing countries. The report explores policy options for countries seeking to improve broadband connectivity.

The IER 2009 includes a chapter on the use of ICTs in the business sector. Drawing on unique data, it examines how ICT use differs both between and within countries, highlighting the rural-urban divide as well as that between large and small companies. The report recommends that governments in developing countries give more attention to ICT uptake and use by small- and medium-sized enterprises (SMEs), as they are lagging behind larger firms. And it discusses those aspects of ICT where government intervention can make a difference.

A third chapter is devoted to the impact of the financial crisis on ICT trade. While a growing share of exports of ICT goods and services is accounted for by developing economies, especially in Asia, the crisis has affected goods and services quite differently. ICT goods are among the categories of trade most negatively affected by the recession, while IT and ICT-related services appear to be among the most resilient. A statistical annex to the report provides data on ICT infrastructure, ICT use, and ICT trade for up to 200 economies. A PDF version of the IER 2009 and its statistical annex are downloadable from the UNCTAD website as of 23 October 2009.

For more information about UNCTAD’s work on ICT for Development please contact:
ICT Analysis Section
Telephone: +41 22 917 55 91
Fax: +41 22 917 00 52

Is technology producing a decline in critical thinking and analysis?

October 26th, 2009 No comments

As technology has played a bigger role in our lives, our skills in critical thinking and analysis have declined, while our visual skills have improved, according to research by Patricia Greenfield, UCLA distinguished professor of psychology and director of the Children’s Digital Media Center, Los Angeles.

Learners have changed as a result of their exposure to technology, says Greenfield, who analyzed more than 50 studies on learning and technology, including research on multi-tasking and the use of computers, the Internet and video games. Her research was published this month in the journal Science.

Reading for pleasure, which has declined among young people in recent decades, enhances thinking and engages the imagination in a way that visual media such as video games and television do not, Greenfield said.

How much should schools use new media, versus older techniques such as reading and classroom discussion?

“No one medium is good for everything,” Greenfield said. “If we want to develop a variety of skills, we need a balanced media diet. Each medium has costs and benefits in terms of what skills each develops.”

Schools should make more effort to test students using visual media, she said, by asking them to prepare PowerPoint presentations, for example.

“As students spend more time with visual media and less time with print, evaluation methods that include visual media will give a better picture of what they actually know,” said Greenfield, who has been using films in her classes since the 1970s.

“By using more visual media, students will process information better,” she said. “However, most visual media are real-time media that do not allow time for reflection, analysis or imagination — those do not get developed by real-time media such as television or video games. Technology is not a panacea in education, because of the skills that are being lost.

“Studies show that reading develops imagination, induction, reflection and critical thinking, as well as vocabulary,” Greenfield said. “Reading for pleasure is the key to developing these skills. Students today have more visual literacy and less print literacy. Many students do not read for pleasure and have not for decades.”

Parents should encourage their children to read and should read to their young children, she said.

Among the studies Greenfield analyzed was a classroom study showing that students who were given access to the Internet during class and were encouraged to use it during lectures did not process what the speaker said as well as students who did not have Internet access. When students were tested after class lectures, those who did not have Internet access performed better than those who did.

“Wiring classrooms for Internet access does not enhance learning,” Greenfield said.

Another study Greenfield analyzed found that college students who watched “CNN Headline News” with just the news anchor on screen and without the “news crawl” across the bottom of the screen remembered significantly more facts from the televised broadcast than those who watched it with the distraction of the crawling text and with additional stock market and weather information on the screen.

These and other studies show that multi-tasking “prevents people from getting a deeper understanding of information,” Greenfield said.

Yet, for certain tasks, divided attention is important, she added.

“If you’re a pilot, you need to be able to monitor multiple instruments at the same time. If you’re a cab driver, you need to pay attention to multiple events at the same time. If you’re in the military, you need to multi-task too,” she said. “On the other hand, if you’re trying to solve a complex problem, you need sustained concentration. If you are doing a task that requires deep and sustained thought, multi-tasking is detrimental.”

Do video games strengthen skill in multi-tasking?

New Zealand researcher Paul Kearney measured multi-tasking and found that people who played a realistic video game before engaging in a military computer simulation showed a significant improvement in their ability to multi-task, compared with people in a control group who did not play the video game. In the simulation, the player operates a weapons console, locates targets and reacts quickly to events.

Greenfield wonders, however, whether the tasks in the simulation could have been performed better if done alone.

More than 85 percent of video games contain violence, one study found, and multiple studies of violent video games have shown that they can produce many negative effects, including aggressive behavior and desensitization to real-life violence, Greenfield said in summarizing the findings.

In another study, video game skills were a better predictor of surgeons’ success in performing laparoscopic surgery than actual laparoscopic surgery experience. In laparoscopic surgery, a surgeon makes a small incision in a patient and inserts a viewing tube with a small camera. The surgeon examines internal organs on a video monitor connected to the tube and can use the viewing tube to guide the surgery.

“Video game skill predicted laparoscopic surgery skills,” Greenfield said. “The best video game players made 47 percent fewer errors and performed 39 percent faster in laparoscopic tasks than the worst video game players.”

Visual intelligence has been rising globally for 50 years, Greenfield said. In 1942, people’s visual performance, as measured by a visual intelligence test known as Raven’s Progressive Matrices, went steadily down with age and declined substantially from age 25 to 65. By 1992, there was a much less significant age-related disparity in visual intelligence, Greenfield said.

“In a 1992 study, visual IQ stayed almost flat from age 25 to 65,” she said.

Greenfield believes much of this change is related to our increased use of technology, as well as other factors, including increased levels of formal education, improved nutrition, smaller families and increased societal complexity.

Categories: Uncategorized