New Majors Introduced to Prepare Students for the Future of Technology

By Allyson Roberts

As a teen in Richboro, Pa., Chris Hahn worked on a farm, saving the money he earned for a much-desired purchase: his own computer.
A few years later, when Hahn started at Widener in 1996, he was among the minority of students who brought computers to their dorm rooms. The computer science major was thrilled that he could hook into a network from his room and didn't have to dial up the Internet over a telephone line. “I had a faster connection in my room at school than I did at home, so I wanted to be at school all the time,” said Hahn, 35, a technology entrepreneur in Seattle who has worked for Microsoft.

Today, George "Tyler" Romasco, a 21-year-old senior computer science major from Abington, Pa., wouldn't dream of coming to campus without a computer. He got his first cell phone at age 11 and his own laptop two years later. By the time he headed off to college, he brought a host of other gadgets that had become "necessary" for everyday life: a television, a PlayStation 3, and an iPod in addition to his laptop and "dumb" cell phone. He's since upgraded to a smartphone.

As someone who did not come of age in the computer age, Dr. Dwight DeWerth-Pallmeyer, director of communication studies at Widener, appreciates the major technological developments of the past three decades. "In the '80s, when I first started teaching, nobody used a computer unless they had to," he said. "It's amazing that now everyone carries one in their pocket. As a child, the closest thing I had to today's smartphone was the wristwatch video phone that Dick Tracy wore. Now, we do that for real with FaceTime and Skype."

The Future of Technology?

While the rapid advancements in technology have been astounding, what does the future hold? How will the world of computing impact our lives in the year 2021—the year when Widener University turns 200—and beyond? Students, faculty, and alumni interviewed predict a vastly different world, one where computers and humans are intertwined even more tightly than they are now.

After reading a February 2011 Time magazine cover story entitled "2045: The Year Man Becomes Immortal," DeWerth-Pallmeyer became enthralled with the idea of singularity and even made it the focus of recent research. Singularity refers to "the transformation of our species into something that is no longer recognizable to humanity at this present time." The transformation results from the moment when computers become more intelligent than the humans who built them. Raymond Kurzweil, an American author, inventor, and futurist who closely follows the idea of technological singularity, predicts it will occur around 2045. "Kurzweil believes in this idea so much that he froze his father after he died," said DeWerth-Pallmeyer. Like Kurzweil, who surmises some good will come from technological singularity, DeWerth-Pallmeyer is a utopian.

Those with opposing views, dystopians, view singularity as the demise of humanity. Dr. Adam Fischbach, assistant professor of computer science, falls into a third category of "nonbelievers." He said he finds it rather convenient that those predicting singularity have scheduled it to occur in their lifetime.

Looking into the more immediate future, Fischbach said prominent examples already exist of computers gaining human skills, such as the speech recognition software now available on iPhones and the supercomputer Watson, which beat human contestants on the TV quiz show Jeopardy! last year. "Watson showed us on Jeopardy how computers can not only identify words, but also interpret their meaning," he said.

Fischbach foresees computers taking on more abilities once considered exclusively human. "Technology is also being developed to interpret our brain waves," he said. "When I was a kid, the sinister brain-wave scanner featured in G.I. Joe was fiction. Today, it's becoming reality."

Student Romasco went a step further, proposing that we could see chips implanted in our skin or small circuits in contact lenses as part of a new user experience. "Communication will require us not to pull a device out of our pockets, but to merely think about it," Romasco said.

Romasco, Hahn, and Dr. Suk-Chung Yoon, chair of the computer science program, all forecast further advances in, and greater prominence for, cloud computing: the technology in which data is held not on a single computer but on an Internet "cloud" of remote servers that multiple systems can access.
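For readers who like to see the idea in code, the short sketch below is one minimal, hypothetical illustration of that model: a piece of data is written once to a shared cloud store and can then be read back from any connected machine. It uses the boto3 library for Amazon S3 only as a convenient example; the bucket and file names are invented, and the snippet does not represent any system described in this article.

```python
# A minimal illustration of the cloud-storage idea described above:
# data written once to a shared "cloud" location can later be read
# from any machine with access. Uses boto3 (the AWS SDK for Python);
# the bucket and key names below are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")

# One system (say, a phone) uploads a document to the cloud.
s3.put_object(
    Bucket="example-cloud-bucket",   # hypothetical bucket name
    Key="notes/today.txt",
    Body=b"Draft written on my phone.",
)

# Any other system (a laptop, a server) can later fetch the same data.
response = s3.get_object(Bucket="example-cloud-bucket", Key="notes/today.txt")
print(response["Body"].read().decode("utf-8"))
```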

Hahn predicts that advancements in cloud computing will also allow for large amounts of data to be analyzed at once to reveal trends in healthcare, economies, weather patterns, and more. "The old model of computing is gone, and the new model is here," Hahn said.

Enter Informatics

Regardless of how the technological future plays out, Widener administrators and faculty have developed curricula and established new majors to prepare students for the coming new world. Beginning in fall 2012, students will be able to major in either media informatics or business informatics, and a new healthcare informatics concentration is set to launch next year.

Widener defines "informatics" as the interdependent relationships among humans, computers, and media systems. It's a relatively new concept in the United States, about 15 years old, and its broad definition is deliberate: the term can apply to any technology that affects humans, and vice versa. A person with an informatics degree might develop an application to engage a business's consumer base, or analyze digital health records to plan an educational outreach program for a specific population.
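As a loose, hypothetical illustration of that second example, the sketch below tallies a handful of made-up digital health records by age group and condition to suggest which populations an outreach program might target. It uses the pandas library in Python; every name and number in it is invented, and real health informatics work would involve far richer data and strict privacy safeguards.

```python
# A toy illustration of the health-records example above: tally
# hypothetical patient records by age band and condition to see which
# populations an educational outreach program might target.
# All data and column names here are invented for illustration.
import pandas as pd

records = pd.DataFrame({
    "age":       [34, 67, 72, 45, 29, 61, 70, 38],
    "condition": ["diabetes", "hypertension", "diabetes", "asthma",
                  "asthma", "diabetes", "hypertension", "diabetes"],
})

# Bucket patients into broad age bands.
records["age_band"] = pd.cut(records["age"],
                             bins=[0, 40, 65, 120],
                             labels=["under 40", "40-64", "65+"])

# Count records per age band and condition; the largest groups are
# natural candidates for targeted outreach.
summary = (records.groupby(["age_band", "condition"], observed=True)
                  .size()
                  .sort_values(ascending=False))
print(summary)
```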

The goal of the new majors is to give students a general understanding of computers, including basic programming, and partner that with knowledge in a specific industry. "Students will enter the workforce with an IT background, but also knowledge in the field they are working in so that they can become problem solvers and innovators," Yoon said.

While alumnus Hahn doesn't use the term informatics to describe his own business, his operation is a prime example of informatics at work. In 2007, Hahn co-founded the Seattle-based company Appature, which he describes as one of the world's first companies to handle the entire problem of marketing with a single product. "We handle everything from data mining—sucking in every piece of information for a company—to taking action with e-mails and surveys and creating automated programs to keep customers engaged," he said. "We even derive insight into what marketing activities do and do not work using best-in-class analytics tools."

He's using his groundbreaking technology to carve out a niche first in healthcare, where Appature currently supports about 25 pharmaceutical brands, before moving on to other industries. In just five years, he's grown the company from two employees to nearly 60, and he's done it using the principles of informatics: he's created technology that reaches and influences customers while also drawing on those customers to shape future marketing efforts.

This ever-changing environment in which Hahn has succeeded is one that Widener faculty monitor and use to bring real-world lessons into the classroom. "I try to find the next big idea so that I can adapt curriculum to ensure that Widener students know about it," Yoon said.

 

###