Cutting-edge information technologies have significantly advanced the ability of computational systems to analyze massive amounts of data and find trends, correlations, and relationships in large datasets. At Case Western Reserve University, we see these advanced data analysis capabilities improving what humans can do, not replacing them. We strive to expand human capability through connection with information technology. Accordingly, we take an approach that puts humans at the center and considers specifically which aspects of humanity are essential for the successful development of new digital technologies. With a focus on what is fundamentally human, we can think more deeply about the ethical, legal, and social dimensions of digital innovation—including artificial intelligence (AI). This is true across applications, whether in healthcare, manufacturing, energy, or any other field. Digital technologies should not replace humans but should expand human capability and potential.
How does this play out in a university research setting? First, we deliberately assemble transdisciplinary teams around a convergent research approach to think through the technology need and how to develop it ethically, in ways that benefit society. Next, we engage deeply with the stakeholders who would benefit from the technology. To illustrate the kinds of projects emerging from this approach, consider the following examples:
Dr. Kiju Lee uses robotics and AI to help assess and treat neurological disorders, such as autism and Alzheimer’s disease. Her robots recognize facial expressions and emotions with great accuracy and can be used to supplement one-on-one therapy for some patients and even to stand in for a human therapist when patients live far from a healthcare facility.
"It takes both people and technology to advance digital technologies effectively and with humans at the center"
Dr. Ken Loparo helps companies use machines and AI to improve safety and increase productivity in manufacturing. His work is not about replacing workers but instead seeks to use machines to increase human effectiveness. Using AI and automation, he helps companies improve factory-floor operations and maintenance.
Dr. Anant Madabhushi develops AI algorithms that can “read” radiology and pathology images with greater speed, accuracy, and specificity than humans. However, the point is not to remove physicians from decision-making. Instead, these tools provide decision support so that physicians and patients, empowered with data, can make better decisions that will more likely result in improved health outcomes.
Dr. Dustin Tyler creates human neural interface systems that provide the sense of touch and motion. With this technology, he makes prosthetics that allow an amputee to “feel” by sending touch signals to the brain. His work in the field of the human-machine interface is fundamentally changing the way we think about prosthetics and, by extension, machines—not merely as tools appended to a patient but rather as technologies that become part of the patient.
What does all of this mean for a CIO? The CIO must lead effectively and ensure that both the organization and its infrastructure are ready, secure, and available for this kind of research to happen. The CIO and their staff need strategic relationships with faculty and should be engaged with the technology, providing services such as data security so that researchers can concentrate on their work. A good CIO can anticipate the needs of research teams by serving as a bridge between the laboratory and the university’s hardware and software infrastructure. It takes both people and technology to advance digital technologies effectively and with humans at the center.