June 26, 2015--The recent human-computer romance film Her and the 1940s-era I, Robot series of short stories may have seemed far-fetched to audiences, but, according to philosophers who have considered the issue, similar situations may not be far off.
“This is real work that’s occurring right now,” said John P. Sullins, introducing a discussion titled “Sex, Virtue and Robots” at a conference this week at the University of Delaware. “It’s not science fiction.”
Sullins, an ethics professor at Sonoma State University, was referring to a team that is working to add artificial intelligence to the RealDoll product, a customizable, life-size sex doll that The New York Times says has sold more than 5,000 units since 1996. Sullins showed the audience at the UD conference a brief video in which RealDoll creator Matt McMullen demonstrates the work of his team of robotics engineers as they seek to make the doll increasingly lifelike, with a stated goal of creating a “genuine bond between man and machine” through emotional and not just physical connections.
Sullins’ talk and a related panel discussion were part of a four-day conference, “Computer Ethics: Philosophical Enquiry,” that took place at Clayton Hall Conference Center on UD’s Newark campus. It was the first international conference jointly sponsored by the International Society for Ethics and Information Technology and the International Association for Computing and Philosophy.
In the “Sex, Virtue and Robots” session, panelists and members of the audience raised concerns about a possible future in which highly realistic “sexbots” are widely available. For example, asked Charles Ess of the University of Oslo, if humans have sexual relations with robots do they become more like robots themselves?
“Robots can imitate love,” Ess said. “We can build robots that can trick us into thinking they care for us. But it is a trick.”
Panelist Shannon Vallor of Santa Clara University questioned whether the use of sexbots would make it more difficult for people to have emotionally mature human relationships and possibly undermine the ability to develop such virtues as empathy, patience and caring for others.
In response to a comment from the audience, Ess said the use of such robots would not necessarily be bad in all situations and could, for example, replace the exploitation that now occurs in sex work.
“It’s not science fiction anymore,” he said. “We need to be aware of the possible problems as well as the possible benefits.”
Other topics at the conference included ethical issues involved in big data, personal health technologies, battlefield robots, tracking devices and self-driving cars.
In one of several keynote addresses, “Getting a Handle on Big Data Ethics,” Deborah Johnson, professor of applied ethics at the University of Virginia, discussed recent research in which Facebook collaborated with scientists at Cornell University to manipulate and analyze the emotions of the site’s users.
That controversial research leads to other concerns about "the futuristic, ultimate consequences of unbridled big-data analytics," Johnson said. Some marketing experts, she said, use the term "neuromarketing" for the process of analyzing and predicting consumers' behavior based solely on what they've done in the past, without asking for their opinions or preferences.
She called for greater public accountability for social media and other data-collecting companies.
The conference at UD was organized by Thomas Powers, associate professor of philosophy and director of UD’s Center for Science, Ethics and Public Policy.
Powers, who also has an appointment in the School of Public Policy and Administration and at the Delaware Biotechnology Institute, will be working during fall semester and Winter Session with Jean-Gabriel Ganascia at a Sorbonne university in Paris.
Ganascia specializes in informatics, and Powers will collaborate with researchers in computer science, philosophy and machine learning on a project titled “Autonomous Agents and Ethics,” sponsored by the French National Research Agency.
Article by Ann Manser
Photo by Kathy F. Atkinson
--From the UDaily