Two UCT academics clinch coveted Google research scholarships

Associate professor Amir Patel and Dr Mohohlo Tsoeu have been admitted to Google’s research scholar programme. They are the only Africans in the programme's 2021 cohort.
Image: Je’nine May/UCT

Two University of Cape Town (UCT) academics have become the institution’s first recipients of awards under Google's research scholar programme.

Associate professor Amir Patel and Dr Mohohlo Tsoeu — from UCT’s electrical engineering department and its new African Robotics Unit — are also the only Africans in the programme’s 2021 cohort.

The programme supports early-career researchers working in fields relevant to the search engine giant, providing “unrestricted gifts to support research at institutions and focused on funding world-class research”.

To be eligible for the award, recipients must work in computer science-related fields.

Patel has been awarded a scholarship in the “machine perception” category and Tsoeu in the “natural language processing” category.

“It feels amazing to be recognised by Google, one of the largest tech companies in the world. It is also exciting and encouraging for us to be mentioned among some of the world’s top universities in the field of robotics and artificial intelligence,” said Patel.

Tsoeu said: “It is a positive affirmation that our research is important, has the potential to have great impact and that we have the intellectual capital to deliver.”

The research project that earned Patel's award aims to provide “deeper insight into the abilities of the world’s greatest animal athletes (located in Africa), such as the cheetah”.

It looks into how they can “robustly traverse through the unstructured world” and says its findings will prove “vital for legged robots if they are ever to leave the confines of the laboratory”.

The biggest obstruction to a universal understanding of animal movement is measuring and modelling whole-body motion in the wild, said Patel. He said his project proposes a deep learning-based motion capture system, WildPose, which “leverages complementary sensor data to remotely obtain high-speed, whole-body 3D animal kinematics in the field from a single view”.

According to Patel, the WildPose system will enable videographers to capture biomechanical data from animals, such as cheetahs and lions, in the wild using a single hand-held device, creating a new source for data collection.

“This research is important as it will allow us to measure the motion of animals in the wild at an unprecedented level,” said Patel. “I believe this award will help me further my goal of moving biomechanics beyond the confines of the laboratory.”

Tsoeu's winning project aims to contribute to the “development of comprehensive, high-quality language corpora for indigenous SA languages”.

It will also investigate and develop novel and high-performance machine-learning algorithms aimed at application areas such as automatic speech recognition, translation, and text-to-speech/sign technology, he said.

He said these applications are in the growing area of human-machine interfacing “but more importantly in the SA context, they contribute towards bridging the human language divide and improve equal access and participation to restore the dignity of currently marginalised groups such as the deaf and hard of hearing communities”.

His research contributes to “bridging the human-to-human language divide that devastates SA, leading to political language debates at universities and other spaces, and marginalisation of native speakers of native languages, especially the deaf community”.

Said Tsoeu: “The world is getting extremely connected, both through travel and the web, and the language divide remains a bottleneck towards enjoying full global connectedness.” 

TimesLIVE
