
U of A researchers develop tool to help build better prosthetic limbs


Neuroscientists look into the complex physical and mental co-ordination needed for seemingly simple movements.

Prosthetic users have to look longer at the object they are interacting with than their able-bodied counterparts, according to University of Alberta research that illustrates just one of the intricacies involved in devising the next generation of prosthetic limbs.

“There are prosthetic devices becoming available that are almost indistinguishable from real limbs, but the real problem is, if you think about how many different ways you can move your hand, each one of those would need a separate channel of information,” said Craig Chapman, a U of A movement neuroscientist in the Faculty of Kinesiology, Sport, and Recreation.

“They’ve engineered these beautiful hands, but it’s very difficult to control them.”

Chapman and his team are interested in the movement decisions and countless computations we don’t think about when we reach out to do something as simple as grabbing a doorknob to open a door. The team includes Jacqueline Hebert, a physical medicine and rehabilitation specialist in the Faculty of Medicine & Dentistry who is working to bring osseointegration surgery (the permanent anchorage of artificial limbs to the human skeleton) to the U of A.

To help dissect the processes behind each movement, Chapman mainly uses motion and eye tracking to gather data needed to understand what’s going on inside the brain.

“We thought, ‘Hey, if we return a sense of touch to people, maybe that’s the first thing that will be freed up—maybe they won’t necessarily move differently, but maybe their eyes will tell us a story about how much extra information they can process,’” said Chapman, who is also a member of the Neuroscience and Mental Health Institute. “That’s really the hypothesis we’ve been chasing for about five years now.”

Chapman and his team devised a tool called Gaze and Movement Assessment, or GaMA, as a way to track both body and eye movements, and put it all into a meaningful three-dimensional space.

Users are fitted with a head-mounted eye tracker that fits like a pair of glasses. At the same time, motion capture markers are placed on the upper limb being tracked, as well as on any other body parts of interest, like the head or torso.
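
In software terms, the core of an assessment like GaMA is aligning those two data streams in one coordinate frame: the eye tracker reports gaze as a ray in the head’s frame, and motion capture gives the head’s pose in the room, so the ray can be re-expressed in world coordinates and compared against tracked objects. Here is a minimal sketch of that idea; all names and numbers are illustrative, not drawn from GaMA itself.

```python
# Minimal sketch (not the GaMA codebase) of fusing a head-mounted eye
# tracker with motion capture. All values are illustrative.
import numpy as np

def gaze_ray_to_world(head_rotation, head_position, gaze_dir_head):
    """Re-express a gaze direction (unit vector in the head frame)
    in world coordinates using the head's motion-capture pose."""
    gaze_dir_world = head_rotation @ gaze_dir_head
    return head_position, gaze_dir_world / np.linalg.norm(gaze_dir_world)

def angle_to_object(origin, direction, object_position):
    """Angle (radians) between the gaze ray and the line from the eye to a
    tracked object; small angles suggest the object is being fixated."""
    to_obj = object_position - origin
    cos_a = np.dot(direction, to_obj) / np.linalg.norm(to_obj)
    return np.arccos(np.clip(cos_a, -1.0, 1.0))

# One illustrative frame: head level with the room, gaze straight ahead,
# and a cup one metre in front of the participant.
head_R = np.eye(3)                     # head orientation (world from head)
head_p = np.array([0.0, 0.0, 1.6])     # head position in metres
origin, direction = gaze_ray_to_world(head_R, head_p,
                                      np.array([0.0, 1.0, 0.0]))
cup = np.array([0.0, 1.0, 1.6])
print(np.degrees(angle_to_object(origin, direction, cup)))  # ~0°: fixating
```

Repeating that calculation frame by frame is what lets researchers say where the eyes were relative to the hand and the object throughout a movement.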

They are then asked to perform two simple tasks that mimic chores prosthetic limb users would encounter in the real world. One is grabbing a box of Kraft Dinner and moving it to three different shelf positions. A second has subjects moving a cup filled with beads from place to place.

“And while they sound like simple tasks, because they were designed with a clinician and occupational therapist, they challenge prosthetic users in unique ways,” said Chapman.

“Getting them to do the movement consistently is what allows us to look at averages and determine what part of a particular movement is so difficult.”

Measures of hand movement, angular joint kinematics and eye gaze were compared with those from a different sampling of non-disabled adults who had previously performed the same protocol with different technology.

The research showed that the prosthetic limb user will continue to look at the device and the object, whereas able-bodied individuals look ahead to where they’re going to put it down.

“Their eyes are free to go to the next place and start planning a successful movement,” he said.

Chapman said his studies show that participants will often overcompensate to complete the task. For example, users of a body-powered prosthetic—a cable-driven device that allows the user to open or close the device using different body motions—put extra strain on their shoulder and trunk because they have limited degrees of freedom at the wrist.

“They will adapt their body to finish the movement, but maybe they’re doing it in a way that might eventually cause some sort of fatigue injury.”

He added, “If they’ve been able to successfully navigate their world and do the things they want to do for everyday living, it’s possible that an advanced prosthetic limb will actually interfere with that, and you just don’t know.”

Chapman noted that GaMA’s broader impact, beyond prosthetic limbs, is that it could help fill knowledge gaps in any number of sensorimotor impairments.

“If you imagine someone who’s developing a tremor because they have Parkinson’s, had a stroke or are aging, and is learning to recover their motor function … we think we can actually tap into some underlying mechanisms to find out what precisely it is that they’re having an impairment with.”

Aris MD can turn MRIs and CT scans into 3D images

The co-founders of Aris MD, CEO Chandra Devam and CTO Scott Edgar, met at summer camp as eleven-year-old wunderkinds and bonded over their love of science fiction. As they grew up, Devam, ever the entrepreneur, hired Edgar to install carpets for one of her first businesses. Devam became a model and real estate investor, while Edgar studied artificial intelligence at Stanford. Years later, when Devam had an idea for a company that required a visionary technology developer, she lured Edgar from his job. They sold that company privately, and Edgar went on to become the brains behind Dryft, which was quickly acquired by Apple. “If you have an iPhone and you use the keyboard on that phone, I did a lot of work on patented technology that drives that keyboard,” said Edgar.

Aris can turn any scan, of humans or machines, into a 3D space one can navigate with VR, AR, or PC. ARIS MD

Several years later, Devam was having minor surgery when a surgeon nicked an artery and endangered her life. Devam wondered why, given all the advances in imaging technology, there is still so much guesswork involved in surgery. She and Edgar soon discovered there was a real need for a product that could turn a CT scan into, effectively, X-ray vision. Using computer vision and augmented reality glasses, the scans can be anchored to the human body. When viewed by a clinician wearing the XR device, the body reveals its intricate layers. There is a lot of computation and there are several distinct technologies in the stack Edgar created, but X-ray vision is a good metaphor for the lay reader.
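
How can a scan be “anchored” to a body? One standard ingredient is rigid registration: matching landmark points identified in the scan to the same landmarks on the patient and solving for the rotation and translation that aligns them. The sketch below shows the classic Kabsch solution to that problem; it is a generic illustration, not Aris MD’s actual pipeline.

```python
# Generic rigid registration sketch: align scan landmarks to body landmarks
# with a least-squares rotation and translation (the Kabsch algorithm).
import numpy as np

def rigid_register(scan_pts, body_pts):
    """Find R, t minimizing ||body - (R @ scan + t)|| over matched points."""
    sc, bc = scan_pts.mean(axis=0), body_pts.mean(axis=0)
    H = (scan_pts - sc).T @ (body_pts - bc)      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, bc - R @ sc

# Illustrative check: recover a known 30° rotation about z plus a shift.
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
scan = np.random.rand(5, 3)                      # five fiducial landmarks
body = scan @ R_true.T + np.array([10.0, 0.0, 2.0])
R, t = rigid_register(scan, body)
print(np.allclose(scan @ R.T + t, body))         # True
```

Once that transform is known, anything segmented from the scan can be drawn in the headset at its corresponding position on the patient.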

“Medical error, and specifically avoidable medical mistakes, are the third leading cause of death in the United States which is possibly the most medically advanced country in the world,” Edgar said at NASA Ignite at SXSW in March. “Did you know that your heart, my heart, your lungs, my lungs, they’re all in different places. We are as different on the inside as we are on the outside. This is pretty obvious if you think about it. I mean you and I look very different. There’s no way that our organs could be in the same places and there’s no way for doctors to understand that easily at this point in time. So, right now when they look at medical imaging, be that CT, MRI, any sort of imaging, what they do is they look at it in slices. So, they look at it like flipping through pictures in a picture book and they try to composite those images in 3D in their mind. This is a huge burden on them and requires a lot of training to do this. We take the images and put them in 3D so that they can view them organically. They can view them for diagnostics. They can view them for surgical planning and using augmented reality we can actually project those images on the patient so they can see them live during the procedure. So, they can see exactly where something like a tumor is without actually having to cut into you. This allows procedures to be much safer, much faster, and much, much more effective.” Aris won Ignite the Night, and was invited to participate in NASA’s larger, more prestigious competition, the 2019 NASA iTech Cycle I, which the company won two weeks ago.
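
The compositing Edgar describes has a simple computational core: the scanner’s per-slice images are stacked, using their physical spacing, into one 3-D volume that can then be resliced or rendered from any angle. A toy illustration with synthetic data (no real DICOM handling):

```python
# Toy slice-stacking sketch: axial slices become a 3-D volume that can be
# cut along planes the scanner never produced. Synthetic data throughout.
import numpy as np

n_slices, rows, cols = 40, 64, 64
slice_thickness_mm, pixel_mm = 2.5, 0.8

# Build coordinates in millimetres; a sphere stands in for an organ.
z, y, x = np.mgrid[0:n_slices, 0:rows, 0:cols].astype(float)
z *= slice_thickness_mm
y *= pixel_mm
x *= pixel_mm
volume = ((x - 25)**2 + (y - 25)**2 + (z - 50)**2 < 15**2).astype(np.float32)

# Once the slices live in one array, other views fall out by indexing:
# here, a coronal cut through the middle of the stack of axial slices.
coronal = volume[:, rows // 2, :]
print(volume.shape, coronal.shape)   # (40, 64, 64) (40, 64)
```

Real systems add interpolation, surface extraction and rendering on top, but the mental “picture book” Edgar describes is replaced by exactly this kind of volume.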

Aris MD co-founders Scott Edgar and Chandra Devam have been friends since they met at a camp for gifted children when they were eleven years old.  ARIS MD

“It was just mind-blowing,” Devam says of the NASA honor. “I could not be prouder of what we’ve done here.” The NASA competition started three years ago to find some of the best emerging private technology from across North America. NASA hopes to incorporate this new tech into its various research programs, including its goal of future space exploration.

It is hard for a small company to break into the medical field. The US market is especially difficult, as it is unclear who pays for new technology and for the training necessary to harness its powers. Nonetheless, with international pilots in the UK and the United States, Aris was able to raise $2M in seed capital from angel investors and has continued to make progress despite its small staff and modest funding. The NASA win put a rocket under the company and has taken it to the heart of the aerospace industry, somewhere it never thought it would be.

“Our new friends and advisors immediately realized that Aris’ technology should work with industrial CT scanners,” explained Devam. “A technician can look inside a rocket engine on the launchpad without taking it apart.” Testing quickly revealed that Aris’ technology is indeed very effective. “People are calling it a breakthrough,” said Devam. “Because it is.”

Now that the word is starting to get out, Aris suddenly has new deal flow from every direction, not just the medical field. Aris is looking at licensing its technology for structural-integrity inspection of infrastructure such as pipelines, construction, automotive components, reactors and airplanes. “Pretty much anything that has been constructed, we could scan,” says Devam.

Aris MD can turn MRIs and CT scans into 3D images, allowing surgeons to literally look inside the body. ARIS MD

Still, Devam says, their hearts are in helping people. “As a business, we save our partners time and money, but in terms of patient care most of all what we’re doing is saving lives,” she said. “Handheld scanners can create a 3D environment in which a remote doctor can work on a real patient. We are just at the beginning of discovering what this technology can do to help humankind.”

Edmonton surgeons create cyborg tech to better transplant lungs and hearts

Dr. Darren Freed with a prototype ex vivo organ perfusion machine that was keeping swine lungs viable on July 23, 2019. The goal is to develop a better method to transport organs between donor and recipient. Photo by Shaughn Butts / Postmedia

An inspirational story he heard as a child, a hot-rod car he loved as a young man, and a desire to do something important with his life all helped propel Edmonton transplant surgeon Darren Freed to develop a transformative technology in organ transplantation.

Freed and his team at Tevosol, an Edmonton medical technology company, have built the prototype of a cyborg device that mimics the human body so that lungs can more easily survive outside the human body and be transplanted in excellent condition.

“It’s a robot donor, a machine donor,” says Freed’s partner, healthcare business expert Ron Mills.

The lungs are removed from deceased donors and placed in the cyborg chamber, which is built to replicate what it’s like for lungs inside a healthy human body. There they can last for as long as 24 hours, about three times longer than current transplant technology permits.

If this multi-million dollar research project is successful — and there’s every indication from clinical trials that it will be — every single patient in Canada who is on the waiting list for new lungs and is in danger of dying could get the transplant they need.

The path to this achievement starts with Freed as a 12-year-old boy reading a news story about Baby Fae, the American infant born in 1984 with a diseased heart who underwent surgery, receiving the heart of a baboon. “I thought to myself, ‘I want to do that when I grow up,’ ” Freed says.

The main issue around transplant surgery for several decades now has been a lack of viable organs to transplant. Only three out of 1,000 deaths are in patients who are brain dead but with their other organs still working, the best case for preserving them for a short time for transplant. But out of those three in 1,000, just one in five will have lungs healthy enough to be used; that works out to roughly one usable lung donor in every 1,700 deaths.

Freed and his partners, Mills and Dr. Jayan Nagendran, hope to double or triple the numbers for viable lung transplantation.

Freed built the first prototype in his car garage at home in 2015. He did all the coding for the machine, having taught himself the skill when he had to reprogram the computer in his 1988 Pontiac Fiero in order to turbocharge the engine.

Being a tinkerer, a musician, a coder and a surgeon provided Freed the skillset he needed to invent something new. “If you’re going to make a significant breakthrough, you have to be able to cross disciplines,” Freed says.

His stubbornness also helps, he adds:  “I’m too dumb to fail. I’m too dumb to figure out when something isn’t working and to give up.”

Freed first tested the device on pig lungs, then on 20 human lungs that had been rejected for transplant because they were too diseased. But on Freed’s machine, which allows for lungs to breathe naturally through pressure changes, these lungs actually improved in quality.

“The lung on the inside of the machine has to love it,” Freed says of the machine. “And the user on the outside of the machine has to love it as well, from all respects.”

The next step is to finish building a more user-friendly, portable and durable product, raise capital for Canada-wide testing, then take the prototype to a Montreal medical show and sell, sell, sell. Each unit will cost about $250,000, with a $12,000 disposable sterile kit for each transplant. In the United States, each transplant now costs $1 to $1.5 million. Mills hopes to have the product on the Canadian market next year and approved in the United States one year later.

In the end, of course, the success of the project rests on how efficiently and economically the machine saves patients from certain death.

Indeed, death is a subject that came up regularly in my interview with Mills and Freed.

Death is front and centre in transplantation. A dead human is needed for lung donation, and it’s desperate, near-death patients who need them.

“We think about death a lot,” Mills said.

“It’s part of life,” Freed said. “This thing called life is a terminal disease. We’re all suffering from it. Think about that. It’s going to come to an end.”

“Jim Morrison said it,” Mills said, then quoted the American singer. “No one here gets out alive.”

What does Dr. Freed himself think about that fact?

“I’m dealing with it, right?” Freed said. “Make a contribution while we’re here. That’s how I think about it. Absolutely.”

On that count, I’d say Freed is most certainly hitting the mark.

VR app gives students a new way to see inner workings of cells

Virtual reality learning tools point to future of post-secondary education, says U of A cell biologist.

U of A cell biology professor Paul LaPointe helped create a virtual reality app that gives medical students a 3-D view of what’s happening inside cells and how drugs work. (Photo: Jordan Carson)

An educational app created at the University of Alberta is giving cell biology students a brand new perspective on their subject and may also offer a glimpse into the not-so-distant future of post-secondary education.

The Cell 101 VR App shows students a virtual reality perspective of the inner workings of cells and their interactions, allowing them to visualize cell biology in a way they never could before.

“Everything that cells do is because of their internal machinery, but how this machinery is constructed from proteins and other biomolecules is very hard for students to conceptualize,” said Paul LaPointe, an associate professor of cell biology at the U of A who helped create the application. 

He said it’s a hard concept to illustrate and contextualize, which makes it difficult for students to understand that when a drug does something in the cell, it’s because the drug fits into the recesses of the proteins inside, preventing them from taking a mechanical action.

“If you can tap into their imagination so that they start painting pictures for themselves and conceptualize what’s happening mechanically, then all the other stuff you have to teach becomes a little easier.”

The virtual reality (VR) app gives students a 3-D picture of the cell and what is happening inside. Instead of a static two-dimensional image seen in textbooks, learners can rotate the components to examine them from all angles, forming a complete picture in their minds.

The project began with a conversation between LaPointe and the Cognitive Projections (CogPro) group in the U of A’s Rehabilitation Robotics Laboratory. CogPro is an initiative to explore new ways of teaching health sciences through emerging technologies such as virtual reality. With LaPointe acting as the subject matter expert, the team developed the app over about 18 months.

It uses Google Cardboard, a virtual reality platform developed by Google for use as a head mount for a smartphone. When users put on VR glasses, they can see four floating boards, each containing information and animations of elements of a cell structure, including lipids, bi-layer proteins and a large-scale image of the cell itself.

“I had 120 students standing in a lecture theatre looking at these things during a demonstration, and it worked amazingly well. They all loved it,” said LaPointe.

Future of education

The app is mostly being used as a teaching aid for individual students or small groups due to the limited number of VR sets. But LaPointe believes it’s only the tip of the iceberg of what could be done to enhance the education of learners—many of whom are increasingly moving away from traditional patterns of learning and embracing new technologies.

“I asked my students if there was anything else they’d like to see from the course incorporated into the app, and the list was everything. Between all the students who responded, it was every single thing we learned,” said LaPointe.

He believes the future of post-secondary education will involve chaptered apps eventually replacing expensive textbooks, saving learners hundreds of dollars for each textbook they don’t have to buy.

Another technology that shows great promise in transforming learning is augmented reality (AR)—superimposing a computer-generated image on a user’s view of the real world.

LaPointe believes AR could already be used to great effect in textbooks today, linking instructional videos to specific pages via a code scanned with a smartphone.

While his work to date has focused on building the Cell 101 VR app, LaPointe hopes to also delve into AR in the near future. He is already taking steps through the U of A’s Heritage Youth Researcher Summer (HYRS) Program, working with high school summer students on augmented reality learning apps. The program is funded by Alberta Innovates with an aim to build expertise in Digital Health in Alberta. Together they are getting a head start on a new technological revolution LaPointe expects will soon transform post-secondary education.

“This is not going to go away. This is going to be done properly and everywhere very soon.”

U of A researchers detecting depression through voice

“I’m interested in applying and developing technology in the real world,” said Eleni Stroulia, professor of computing science at the University of Alberta. JOHN ULAN / SUPPLIED

Improved technology can now more accurately detect a depressed mood using just the sound of your voice, according to University of Alberta research.

“This is work that is important to us, that we are excited to do, that is rewarding and in the long term can be useful to humanity. But it’s too early, and we have to think about what’s the right way of making this happen,” said Eleni Stroulia, a professor in the department of computing science at the University of Alberta.

Doctoral student Mashrura Tasnim and Stroulia combined several algorithms, which are like recipes or “simple instructions to tell the computer what to do,” to recognize depression more accurately through voice cues, including volume, frequency and pitch, said Stroulia.
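
To make those cues concrete, here is a minimal sketch of per-clip features capturing volume, frequency content and pitch, using the librosa audio library. The file path is a placeholder, and the study’s actual features and algorithm combination are not spelled out here; this is only the flavour of the approach.

```python
# Sketch of voice-cue features (volume, frequency, pitch) per audio clip.
# The path is a placeholder; real systems use richer features and models.
import numpy as np
import librosa

def voice_features(path):
    y, sr = librosa.load(path, sr=None)
    rms = librosa.feature.rms(y=y)[0]                  # loudness per frame
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]  # frequency
    f0 = librosa.yin(y, fmin=65, fmax=400, sr=sr)      # pitch track in Hz
    return np.array([rms.mean(), rms.std(),
                     centroid.mean(), centroid.std(),
                     np.nanmean(f0), np.nanstd(f0)])

features = voice_features("sample_clip.wav")  # hypothetical clip
```

Feature vectors like these, paired with clinician-assessed labels, are what a combination of classifiers could then be trained on, which echoes the researchers’ description of stacking several algorithms for a more accurate prediction.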

The goal is to develop a meaningful application from this technology that could help people understand depression. Approximately 11 per cent of Canadian men and 16 per cent of Canadian women will experience major depression in the course of their lives, according to the Government of Canada.

That could lead to the development of a mobile app, running on a user’s phone, that could collect voice samples, recognizing and tracking indicators of mood, much like step-counters that track physical movement, said Stroulia.

An app could be used by individuals tracking changes in their own moods over time, as a record for talking to a psychiatrist, or by caregivers. Data from voice cues could add more information to mental health assessments and treatment without relying on the patient to assess and record their emotional state.

“Yet another piece of information that hopefully helps diagnose things better,” said Stroulia.

But, there are legal, ethical and practical issues that would need to be addressed before an app helping to diagnose mental health issues would be useful. It could raise privacy concerns, so society has to think about how we deploy and manage these things, she said.

“This data cannot leak, but somebody has to access the data and make use of it. Absolute security guarantees uselessness,” said Stroulia. We need a legal and ethical framework to address data privacy issues, she added, especially since data is now part of almost all of our activities.

“Most disciplines collect data. We’ve had a technological advance that completely breaks open the types of data we can collect and the problems we could be addressing,” said Stroulia.

Robotic Arms: Amii and BLINC team up to improve quality of life after limb loss

Transformation starts with learning. That’s a slogan tied to the idea of innovation, and what better place to embody this idea than the Alberta Machine Intelligence Institute (Amii)?

Founded in the Edmonton Metropolitan Region in 2002, Amii counts health care among its focus areas. The institute funds a wide range of health-related research, including the development of improved prosthetic limbs, the creation of new tools to diagnose tuberculosis, and the use of patient data to improve diabetes treatment.

Through these and other projects, the institute uses artificial intelligence (AI) and machine learning to improve the lives of people the world over.

Amii’s collaboration with the University of Alberta-based Bionic Limbs for Improved Natural Control (BLINC) Lab on the Adaptive Prosthetics Program uses real-time machine learning methods for assistive rehabilitation and intelligent artificial limbs.

Another centre of innovation, BLINC Lab brings together researchers and clinicians to advance prosthetic care through robotics and machine learning.

HANDi Hand, a creation from BLINC Lab and Amii

Working directly with patients and clinicians, the program developed technologies including the Bento Arm and the HANDi Hand, two intelligent artificial limbs that use machine learning to help patients regain their full range of motion and ability.
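
As an illustration of what real-time machine learning can mean for a limb, the sketch below runs an online least-mean-squares update that learns to map muscle-signal features to a joint command while the device is in use. It is a generic toy under those assumptions, not the Bento Arm or HANDi Hand control code, and the signals are synthetic.

```python
# Toy online learner for myoelectric control: a linear model maps EMG
# features to a joint command and keeps adapting from prediction error.
import numpy as np

rng = np.random.default_rng(0)
w = np.zeros(4)                     # model weights over 4 EMG channels
lr = 0.05                           # learning rate for online updates

for step in range(1000):
    emg = rng.normal(size=4)        # stand-in for rectified, smoothed EMG
    target = 0.8 * emg[0] - 0.3 * emg[2]   # "true" intended wrist velocity
    pred = w @ emg                  # command that would drive the joint
    w += lr * (target - pred) * emg # least-mean-squares update, per sample

print(np.round(w, 2))               # ≈ [0.8, 0.0, -0.3, 0.0]
```

The appeal of this style of learning is that it adapts sample by sample, so the controller can keep tracking a user whose muscle signals drift over a day of wear.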

Asking patients to test the technology is critical so real-world tasks can be analyzed to provide relevant solutions outside of a lab.

The work of Amii and BLINC Lab is a game-changer for people looking to regain human motion after experiencing limb loss. Forward-thinking and influential, the two organizations collaborate with other sectors to produce innovations in the health space and beyond.

Alberta company wins NASA tech competition with 3D ‘Star Trek’ surgery


An “immersive reality” product developed in Edmonton has skyrocketed all the way to NASA. Aris MD recently took top prize in the U.S. space agency’s iTech Ignite the Night pitch competition in Austin, Texas.

They were the only Canadian competitors.

“As Canadians, you go into it thinking, ‘Oh I’m so lucky to be here,’” said CEO Chandra Devam. “So it was really nice to be told: ‘You’re the best.’”

“And honestly, our stuff is the best.”

The Aris MD software turns conventional medical imaging — MRI, CT scan, x-ray — into 3D images. Physicians can wear smart goggles to view body structures from all angles and superimpose the images over a patient’s body on the operating table, like Star Trek surgery.

“It’s essentially giving a surgeon x-ray vision. So they can see through your skin where your organs are located.”

The technology can also flag potential tumours or abnormalities which the human eye can’t see.

NASA judges appreciated the possibility of autonomous medicine in space.

“If somebody’s in a spaceship, maybe there are doctors present, maybe something happens to those doctors… what do you do then?” said Devam.

“The (artificial intelligence) being able to diagnose is extremely valuable.”

Devam and partner Scott Edgar were inspired by her near-death experience several years ago. During what was supposed to be a simple day surgery, a doctor accidentally cut an artery.

“(The medical team) didn’t know (the artery) was there. I started bleeding to death. They had to give me a blood transfusion to save my life.”

Devam hopes Aris MD could prevent that kind of mistake.

She and Edgar have now been invited to the Global Entrepreneurship Summit in the Netherlands in June.

They’re also presently in talks with several global companies.

Devam says it all seems so sci-fi.

“What bigger rush is there to dream up the biggest dream you can, make it reality, and have that saving lives?”

U of A nanotech facility gets funding boost to ramp up to commercial scale

$3.4M investment will allow local company to mass-produce tiny medical devices that can monitor vital signs, help diagnose disease or even restore eyesight.

NanoFAB director Eric Flaim (left) shows Natural Resources Minister Amarjeet Sohi some of the devices nanotechnology companies can manufacture at the U of A facility, which received a total of $3.4 million in new funding today to expand its capacity to commercial scale. (Photo: Michael Brown)

Microscopically thin films of metal are essential to tiny biomedical devices placed in the human body to monitor vital signs, diagnose disease or even restore eyesight.

Every micron of the film has to be carefully placed on the device’s sensor, which means manufacturing on a commercial scale is a daunting challenge.

But thanks to a $3.4-million investment in the University of Alberta’s nanoFAB facility by the federal and provincial governments—as well as by Micralyne Inc., a local company that builds nano-devices for biomedical applications—commercializing nanotechnology research has become much easier.

The funding will be used to purchase a high-volume deposition tool used for applying thin films, along with contact aligners and etching tools—all of which will allow companies to manufacture devices quickly by the thousands.

“We’ve been moving wafers and chips along manually, but we were missing a way to move thousands along on a regular basis,” said Micralyne president and CEO Ian Roane at today’s funding announcement.

“These tools we’re buying allow us to really scale up,” he said, adding that his company plans to hire 50 or 60 more employees to meet expected demand once the tools are up and running.

“We have a commitment from a very large biomedical company that plans to repatriate production back to Canada,” added Roane. “This is a way of keeping our highly qualified professionals in this province.”

Some of the biomedical devices Micralyne manufactures include tiny sensors that measure blood pressure in the body or brain pressure in the skull, or restore eyesight using small needles implanted behind the eyes, said Roane.

The $1-million federal portion of the funding is part of Canada’s Innovation and Skills Plan aimed at creating new jobs for the global economy.

“Our investment in the University of Alberta will make nanoFAB a cornerstone of the regional innovation ecosystem,” said Amarjeet Sohi, federal minister of natural resources, who announced the funding on behalf of Western Economic Diversification Canada.

“It will help firms develop new micro- and nanotechnology while creating well-paying jobs and prosperity for Canada.”

The Alberta government will also contribute $1 million for the tools, and Micralyne will pitch in $1.4 million.

“Industry partners like Micralyne look for interesting research coming out of the U of A, hoping to scoop those up, add them to their manufacturing portfolio and sell them all around the world,” said Eric Flaim, director of nanoFAB, a national, open-access training, service and collaboration centre supporting research and development in nano sciences and engineering.

Flaim said nanoFAB currently has 17 staff members supporting more than 50 companies in the manufacture of products. In the short term, the new tools will help 10 Alberta companies bring five products to market, hire dozens of highly qualified personnel and generate $20 million in industry revenue.

“There comes a point in your research where being able to work on these kind of ideas becomes limited by the reliability, repeatability and uniformity of what you’re actually making,” he said.

“These tools will increase researchers’ ability to make more reliable devices, while creating a pathway for those products to be moved into industrial manufacturing.”

In addition to biomedical devices, micro- and nanotechnology developed at the U of A is used in a range of applications in life sciences, clean technology, digital technology and other sectors.

“U of A researchers are working at the forefront of micro- and nanotechnology,” said U of A president David Turpin. “With industry and government support like we’re seeing today, we are shaping a world-class innovation and manufacturing sector right here in Alberta.”

“Our government’s investment will help diversify Alberta’s micro- and nanotechnology sector … and support future innovation and long-term economic and social benefits for Alberta,” said Deron Bilous, Alberta minister of economic development and trade.

Artificial intelligence for mental health

The boundaries of artificial intelligence techniques are continually being advanced to improve our ability to interpret complex medical imaging results and diagnose diseases. Now, a new tool developed by University of Alberta researchers diagnoses schizophrenia from patient brain scans—a diagnosis that has historically relied on subjective accounts of patient experiences rather than anything read from a scan.

The study brought together campus expertise in two key areas: machine learning and psychiatry. Sunil Kalmady, lead author on the study and a postdoctoral fellow at the University of Alberta, explains the traditional difficulty in diagnosing the disease, and how machine learning was able to present a solution.

(From left) Computing scientists Russ Greiner and Sunil Kalmady worked with psychiatry professor Andrew Greenshaw to improve the accuracy of an AI-based diagnostic tool for schizophrenia. The program can predict with 87 per cent accuracy whether a patient has the illness by analyzing a brain scan. (Photo: Karim Ali)

“Schizophrenia is characterized by a constellation of symptoms that might co-occur in patients. Two individuals with the same diagnosis might still present different symptoms,” said Kalmady. “This often leads to misdiagnosis. Machine learning, in this case, is able to drive an evidence-based approach that looks at thousands of features in a brain scan to lead to an optimal prediction.”

The result is EMPaSchiz (Ensemble algorithm with Multiple Parcellations for Schizophrenia prediction), a model that has been trained on scans from many patients diagnosed with schizophrenia.
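
The name describes the recipe: each brain parcellation scheme turns a scan into its own feature vector, one classifier is trained per scheme, and their predicted probabilities are combined. A minimal sketch of that ensemble pattern, with random placeholder arrays standing in for the real imaging features:

```python
# Sketch of the multiple-parcellation ensemble idea: one classifier per
# parcellation scheme, probabilities averaged. Placeholder data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_scans = 100
y = rng.integers(0, 2, n_scans)              # 1 = schizophrenia diagnosis

# One feature matrix per parcellation scheme (hypothetical names/sizes).
parcellations = {"scheme_A": rng.normal(size=(n_scans, 50)),
                 "scheme_B": rng.normal(size=(n_scans, 80)),
                 "scheme_C": rng.normal(size=(n_scans, 120))}

models = {name: LogisticRegression(max_iter=1000).fit(X, y)
          for name, X in parcellations.items()}

def ensemble_predict(feature_sets):
    """Average per-parcellation probabilities; threshold at 0.5."""
    probs = [models[name].predict_proba(X)[:, 1]
             for name, X in feature_sets.items()]
    return (np.mean(probs, axis=0) > 0.5).astype(int)

preds = ensemble_predict(parcellations)      # in-sample, illustration only
```

Averaging over parcellations hedges against any single way of carving up the brain missing the regions that carry the diagnostic signal.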

Kalmady worked on the tool under the supervision of both Russ Greiner, professor in the Department of Computing Science, and Andrew Greenshaw, professor and associate chair in the Faculty of Medicine & Dentistry’s Department of Psychiatry.

“Machine learning provides a set of tools that can use that existing scan information to produce the classifier we want: a tool that, given a scan of a person’s brain, can predict whether that person has schizophrenia,” said Greiner. “Moreover, there are ways to estimate how accurate this tool will be and how often it will provide the correct diagnosis.”

When put to the test, EMPaSchiz was able to identify the disease in new scans with 87 per cent accuracy, outperforming existing AI models in identifying the disease.

EMPaSchiz is also one of the first machine learning tools trained exclusively on data from patients who aren’t yet undergoing drug treatment—which could make it more valuable in the early stages of diagnosing the illness.

Machine learning on the brain

Artificial intelligence and machine learning techniques hold particular promise for complex fields like that of mental health.

Some of the most reliable indicators the algorithm found are highlighted in this image, where red indicates elevated activity and blue indicates suppressed activity in the brain. Photo credit: Sunil Kalmady

“Mental health disorders are highly complex in terms of causes and manifestation of symptoms,” said Greenshaw. “Machine learning and future AI are approaches that enable a multi-dimensional, data-driven inroad that captures the level of complexity and objectivity that we need to unravel the wicked problems of understanding mental illness.”

EMPaSchiz also shows the potential of machine learning to develop accurate computing science tools to interpret a situation where there are many variables at work—a field in which Edmonton and the University of Alberta are leading players.

“Machine learning is a powerful and appropriate tool to use in situations like this,” explained Greiner. “Here, no one knows which combination of features will make the best predictor. And critically, there is already a database of training samples identified by skilled physicians. This has already led to successful AI-driven systems by researchers around the world, including right here on campus and in our lab.”

The paper, “Towards artificial intelligence in mental health by improving schizophrenia prediction with multiple brain parcellation ensemble-learning,” was published in npj Schizophrenia (doi: 10.1038/s41537-018-0070-8).


Originally posted on: University of Alberta’s blog – Folio

Want to find out more about the artificial intelligence and machine learning research going on at the University of Alberta? Check out the Faculty of Science artificial intelligence hub to learn more.

5 fantastic health innovations from Edmonton in 2018

From lab-grown human tissue to a simple case for life-saving pills, Edmonton brains came up with some pretty cool stuff in 2018.

One of my favourite parts of being the health reporter for Global Edmonton is the backstage pass to mind-blowing medical research in our city. Narrowing it all down to a few of the most innovative technologies or treatments every year is never easy, but here are my picks for 2018.

1. Donor lung assessment tool

In the past five years, the University of Alberta Hospital’s transplant team managed to increase the number of lung transplants from 30 per year to over 70. That’s thanks to their ex vivo perfusion technology, which allows them to store donor lungs outside of the body while they treat or repair the organs. It means previously unusable lungs could be transplanted — and the process became even more effective in 2018. Two doctors, Jayan Nagendran and Benjamin Adam, developed an assessment tool to measure the exact amount of injury in donor lungs and treat them accordingly, optimizing the organs’ health for their new recipient.

2. Sensation in missing limbs

We’ve seen some incredibly sophisticated bionic limbs over the years, but one thing they’re all missing is feeling. Dr. Jackie Hebert’s team partnered with researchers from the Cleveland Clinic to figure out a way for a person with no hand to sense complex hand movements. That involved surgically rewiring the nerves of six patients and using tiny robots to vibrate the muscles. It resulted in patients using their prosthetic limbs up to 300 times better than before — because they didn’t have to watch every move.

3. Lab-grown nose cartilage

Right now, if you needed your nose reconstructed, surgeons would move cartilage from your ears or ribs to the middle of your face. That would require several surgeries and a potentially long recovery period. But three Edmonton doctors — Adetola Adesida, Kal Ansari and Martin Osswald — developed a way to grow nasal cartilage in a lab. They take a tiny biopsy and seed the cells on a collagen scaffold, sprouting an unlimited supply of cartilage from the patient’s own cells.

4. Prostate cancer blood test

Dr. John Lewis’ U of A team has been developing Clarity DX for years. In 2018, it finally progressed to widespread testing across Alberta. The simple blood test is 40 per cent more accurate than the standard PSA test, and can predict which men need a biopsy and which can safely avoid the painful procedure.

5. SMHeartCard

Aspirin and nitroglycerin are standard emergency medications for patients with heart attack symptoms, yet a U of A study shows only 20 per cent of heart disease patients carry nitroglycerin. Part of the problem is that it often comes in a glass bottle. That’s why doctors John Mackey and Neal Davies, along with retired engineer James Stewart, invented the SMHeartCard, a compact plastic container that protects the pills and fits easily into a wallet, purse or pocket.
