Galgotias University was removed from the India AI Impact Summit after presenting a commercially available Chinese robotic dog as its own innovation. The university initially claimed the robot, identified by internet users as the Unitree Go2, was a product of its Centre of Excellence. Following an apology for the “confusion” and an explanation that its representative was “ill-informed,” the university’s participation status remained unclear. The incident highlights the pressure on India to demonstrate genuine local innovation as it aims to become a global AI hub.
Read the original article here
A rather unusual situation has emerged from the world of artificial intelligence, where a robotic dog, seemingly purchased off the shelf, has led to an Indian university being unceremoniously ejected from a prestigious AI summit. The incident brings to mind past experiences with academic competitions, where the line between genuine student innovation and external assistance often blurred, sometimes to a comical degree. Years ago, when I judged a university engineering competition, it was understood that a certain amount of “help” from teaching assistants or technicians might occur. While not ideal, the protocol was to quiz the students; if they could articulate the concepts, the credit was, perhaps begrudgingly, given.
This brings us to one particular year, when an Indian university presented what they called an “infrared fingertip detection thingy” and a “floating keyboard.” To the judges, many of whom came from civil or mechanical engineering rather than computer science, the technology was astounding. They were thoroughly impressed, believing the students had conceived and built these marvels entirely themselves. The students, it turned out, had been less than forthcoming. With my computer science background, I recognized the sophistication, and the likely external origin, of these components. I scored them zero, while the other judges, awestruck, awarded scores in the high nineties. That was enough to secure them third place, a result that felt entirely unearned.
The ensuing fallout was swift and, frankly, predictable. The university team expressed outrage, accusing me of racism and favoritism. It was a curious accusation, especially since the team from my own university had performed respectably, placing around eighth. Backstage, I confronted the professors, seeking clarification. When asked if the students had built the infrared sensor, the answer was no. The “floating keyboard,” which was essentially a projector, also wasn’t their creation. And crucially, regarding the fingertip recognition software, the very core of my expertise, the response was again, no.
My response to the other judges was direct: “I wasn’t aware this was an assembly competition. Perhaps we should announce that this is the new format.” My zero score was upheld, though the university still received an undeserved third place. My combative approach, perceived as lacking an “educational spirit,” led to my exclusion from judging the following year. The irony was not lost on me when I later learned who had won that subsequent competition.
Which returns us to the robotic dog. It is perplexing that a readily available, commercially produced robot like the Unitree Go2 would be presented as the pinnacle of a university’s AI research at a global summit. Showcasing research built on off-the-shelf hardware is a legitimate and active practice in embodied AI, but the crucial element is full disclosure. The focus should have been the AI behaviors and the software the university developed on top of the platform, not the robot itself; otherwise it is akin to buying a powerful computer and claiming its operating system as your own groundbreaking invention. That internet users identified the robot as a Unitree Go2 is a measure of how transparent the misstep was.
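To make the distinction concrete, here is a minimal sketch of what an honest presentation could have looked like. The locomotion calls below follow the documented interface of Unitree’s publicly available Python SDK (`unitree_sdk2py`), though exact names should be verified against the installed version; the `detect_person` function and the simple patrol loop are hypothetical stand-ins for the in-house work a lab could legitimately claim as its own.

```python
# Sketch: in-house behavior layered on a commercial quadruped platform.
# Assumes Unitree's Python SDK (unitree_sdk2py); call names follow its
# documented SportClient interface but may vary by SDK version.
import time

from unitree_sdk2py.core.channel import ChannelFactoryInitialize
from unitree_sdk2py.go2.sport.sport_client import SportClient


def detect_person(frame) -> bool:
    """Hypothetical stand-in for a lab's own perception model.

    This, not the robot, is the publishable contribution: the vision
    pipeline, the training data, the evaluation.
    """
    raise NotImplementedError("plug in your own detector here")


def main() -> None:
    # Vendor-provided plumbing: DDS channel setup over the robot's
    # network interface (interface name is machine-specific).
    ChannelFactoryInitialize(0, "eth0")

    # Vendor-provided locomotion controller; the university did not
    # write this, and saying so costs nothing.
    sport = SportClient()
    sport.SetTimeout(10.0)
    sport.Init()
    sport.StandUp()

    # In-house decision loop: even a simple patrol policy like this is
    # the kind of thing a lab can legitimately present as its own work.
    for _ in range(100):
        frame = None  # would come from an onboard or external camera
        if frame is not None and detect_person(frame):
            sport.StopMove()           # yield to the person
        else:
            sport.Move(0.3, 0.0, 0.0)  # walk forward at 0.3 m/s
        time.sleep(0.1)

    sport.StandDown()


if __name__ == "__main__":
    main()
```

Framed this way, the off-the-shelf platform is a tool, clearly credited to its maker, and the claim of originality attaches only to the layer the researchers actually built.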
A broader pattern seems to emerge from these experiences. I recall attempting to hire a junior-level software engineer in New York, only to be told that hiring three people in India would be more cost-effective. The candidates, often holding PhDs and extensive experience on paper, struggled with fundamentals like basic flowcharts. It suggests a disconnect, somewhere in the educational or hiring pipeline, where credentials don’t always translate to practical capability.
More directly relevant to the robo-dog incident, it appears the university did indeed purchase robotic hardware, including a robo-dog and a soccer-drone, for the presentation. The intention may have been to demonstrate research and development *around* these platforms, which is a valid pursuit. However, a faculty member from a communications department, perhaps not deeply immersed in the technical AI details, went to the media and claimed complete in-house, end-to-end development. Once disseminated online, those claims quickly unraveled. The subsequent government intervention, requesting that the university leave the summit premises, shows the scale of the embarrassment. Attempts to backtrack, with statements that the university was using the technology for further research, did little to mitigate the damage and drew considerable online criticism.
The very idea of presenting a commercially acquired robot dog at a global AI summit as a primary achievement is fundamentally flawed. It’s a peculiar oversight, especially considering the vast number of talented programmers and AI researchers in India. One can only imagine the internal discussions or the pressure to present something “cutting-edge” that led to this particular decision. The subsequent online backlash, largely from within India itself, further amplifies the strangeness of the situation. It certainly underscores the importance of transparency and authenticity in academic and technological presentations. The incident serves as a stark reminder that while technological progress is exciting, integrity in its representation is paramount.
