To patent or not to patent?

As universities and research institutions look to protect the knowledge they develop, András Havasi questions time frames, limited resources, and associated risks.
András Havasi

The last decade has seen the number of patent applications worldwide grow exponentially. Today’s innovation- and knowledge-driven economy certainly has a role to play in this. 

With over 21,000 European and around 8,000 US patent applications in 2018, the fields of medical technologies and pharmaceuticals—healthcare industries—are leading the pack. 

Why do we need all these patents?  

A patent grants its owner the right to exclude others from making, using, selling, and importing an invention for a limited period of 20 years. In practice, this means market exclusivity, provided the invention is commercialised within that window. If the product sells, the owner will benefit financially. The moral of the story? A patent is but one early piece of the puzzle in a much longer, more arduous journey towards success.

Following a patent application, an invention usually needs years of development for it to reach its final product stage. And there are many ‘ifs’ and ‘buts’ along the way to launching a product in a market; only at this point can a patent finally start delivering the financial benefits of exclusivity. 

Product development is a race against time. The longer the development phase, the shorter the effective market exclusivity a product will have, leaving less time to make a return on the development and protection costs. If this remaining time is not long enough, and the overall balance stays in the negative, the invention could turn into a financial failure.

Some industries are more challenging than others. The IT sector is infamous for its blink-and-you-miss-it evolution. The average product life cycle of software has shrunk from three to five years down to six to 12 months. However, more traditional sectors cannot move that quickly.

The health sector is one example. Research, development, and regulatory approval take much longer, spanning an average of 12–13 years from a drug’s inception to its release on the market, leaving only seven to eight years for commercial exploitation.

So the real value of a patent is the effective length of market exclusivity, factored in with the size of the market potential. Can exclusivity in the market give a stronger position and increase profits to make a sufficient return on investment? All this makes patenting risky, irrespective of the technological content—it is a business decision first and foremost.

Companies see the opportunity in this investment and are happy to take the associated risks. But why does a university bother with patents at all and what are its aims in this ‘game’?

Universities are hubs of knowledge creation and today’s economy sees the value in that. As a result, research institutions intend to use and commercialise their know-how. And patenting is an essential part of that journey.

The ultimate goal and value of a patent remain the same; however, it serves a different purpose for universities. Patents enable them to legally protect their rights to inventions they helped nurture and claim financial compensation if the invention is lucrative. At the same time, patent protection allows the researchers to freely publish their results without jeopardising the commercial exploitation of the invention. It’s a win-win situation. Researchers can advance their careers, while the university can do its best to exploit the output of their work, bolster its social impact, and eventually reinvest the benefits into its core activity: research. 

At what price?

Patenting may start at a few hundred or thousand euros, but the costs can easily accumulate to tens or even hundreds of thousands over the years. However, this investment carries more risk for universities than for companies.

Risks have two main sources. Firstly, universities’ financial capabilities are usually more limited when compared to those of businesses. Secondly, universities are not the direct sellers of the invention’s eventual final product. For that, they need to find their commercial counterpart, a company that sees the invention’s value and commercial potential. 

This partner needs to be someone who is ready to invest in the product’s development. This is the technology transfer process, where the invention leaves the university and enters industry, and it is the greatest challenge for university inventions. Here, the issue of time rears its head again: the process of finding suitable commercial partners further shortens the effective period of market exclusivity.

A unique strategy is clearly needed here. Time and cost are top priorities. All potential inventions deserve a chance, but risks and potential losses need to be minimised. It is the knowledge transfer office’s duty to manage this. 

We minimise risks and losses by finding (or trying to find) the sweet spot of time frames with a commercial partner, all while balancing commercial potential and realistic expectations. The answer boils down to: do we have enough time to take this to market and can we justify the cost?

Using a cost-optimised patenting strategy, we can postpone the first big jump in costs until two and a half years in. After this point, the costs start increasing significantly. The rule of thumb is that about five years into a patent’s lifetime, the likelihood of licensing drops to a minimum. So on a practical level, a university invention needs to be commercialised very quickly. 

Maintaining a patent beyond these initial years can become unfeasible, because even the most excellent research doesn’t justify the high patenting costs if the product is not wanted by industry. And the same applies to all inventions. Even in the health sector, despite product development cycles being longer, if a product isn’t picked up, patents can be a huge waste of money.

Patenting is a critical tool for research commercialisation, and universities should protect inventions and find the resources to file patent applications. However, the opportunities’ limited lifetime cannot be ignored. A university cannot fall into the trap of turning an interesting opportunity into a black hole of slowly expiring hopes. It must be diligent and level-headed, always keeping an ear to the ground for the golden goose that will make it all worth it. 

The unusual suspects

When it comes to technology’s advances, it has always been said that creative tasks will remain out of their reach. Jasper Schellekens writes about one team’s efforts to build a game that proves that notion wrong.

The murder mystery plot is a classic in video games; take Grim Fandango, L.A. Noire, and the epic Witcher III. But as fun as they are, they do have a downside—they don’t often offer much replayability. Once you find out the butler did it, there isn’t much point in playing again. However, a team of academics and game designers are joining forces to pair open data with computer-generated content to create a game that gives players a new mystery to solve every time they play. 

Dr Antonios Liapis

The University of Malta’s Dr Antonios Liapis and New York University’s Michael Cerny Green, Gabriella A. B. Barros, and Julian Togelius want to break new ground by using artificial intelligence (AI) for content creation. 

They’re handing the design job over to an algorithm. The result is a game in which all characters, places, and items are generated using open data, making every play session, every murder mystery, unique. That game is DATA Agent.

Gameplay vs Technical Innovation 

AI often only enters the conversation in the form of expletives, when people play games such as FIFA and players on their virtual team don’t make the right turn, or when there is a glitch in a first-person shooter like Call of Duty. But the potential applications of AI in games are far greater than merely making objects and characters move through the game world realistically. AI can also be used to create unique content—it can be creative.

While creating content this way is nothing new, the focus on using AI has typically been purely algorithmic, with content being generated through computational procedures. No Man’s Sky, a space exploration game that took the world (and crowdfunding platforms) by storm in 2015, generated a lot of hype around its use of computational procedures to create varied and different content for each player. The makers of No Man’s Sky promised their players galaxies to explore, but enthusiasm waned in part due to the monotonous game play. DATA Agent learnt from this example. The game instead taps into existing information available online from Wikipedia, Wikimedia Commons, and Google Street View and uses that to create a whole new experience.

Data: the Robot’s Muse  

A human designer draws on their experiences for inspiration. But what are experiences if not subjectively recorded data on the unreliable wetware that is the human brain? Similarly, a large quantity of freely available data can be used as a stand-in for human experience to ‘inspire’ a game’s creation. 

According to a report by UK non-profit Nesta, machines will struggle with creative tasks. But researchers in creative computing want AI to create as well as humans can.

However, before we grab our pitchforks and run AI out of town, it must be said that games using online data sources are often rather unplayable. Creating content from unrefined data can lead to absurd and offensive gameplay situations. Angelina, a game-making AI created by Mike Cook at Falmouth University, created A Rogue Dream. This game uses Google Autocomplete to name the player’s abilities, enemies, and healing items based on an initial prompt from the player. Problems occasionally arose as nationalities and genders became linked to racial slurs and dangerous stereotypes. Apparently there are awful people influencing autocomplete results on the internet. 

DATA Agent uses backstory to mitigate problems arising from absurd results. A revised user interface also makes playing the game more intuitive and less like poring over musty old data sheets. 

So what is it really? 

In DATA Agent, you are a detective tasked with finding a time-traveling murderer now masquerading as a historical figure. DATA Agent creates a murder victim based on a person’s name and builds the victim’s character and story using data from their Wikipedia article.

This makes the backstory a central aspect to the game. It is carefully crafted to explain the context of the links between the entities found by the algorithm. Firstly, it serves to explain expected inconsistencies. Some characters’ lives did not historically overlap, but they are still grouped together as characters in the game. It also clarifies that the murderer is not a real person but rather a nefarious doppelganger. After all, it would be a bit absurd to have Albert Einstein be a witness to Attila the Hun’s murder. Also, casting a beloved figure as a killer could influence the game’s enjoyment and start riots. Not to mention that some of the people on Wikipedia are still alive, and no university could afford the inevitable avalanche of legal battles.

Rather than increase the algorithm’s complexity to identify all backstory problems, the game instead makes the issues part of the narrative. In the game’s universe, criminals travel back in time to murder famous people. This murder shatters the existing timeline, causing temporal inconsistencies: that’s why Einstein and Attila the Hun can exist simultaneously. An agent of DATA is sent back in time to find the killer, but time travel scrambles the information they receive, and they can only provide the player with the suspect’s details. The player then needs to gather intel and clues from other non-player characters, objects, and locations to try and identify the culprit, now masquerading as one of the suspects. The murderer, who, like the DATA Agent, is from an alternate timeline, also has incomplete information about the person they are impersonating and will need to improvise answers. If the player catches the suspect in a lie, they can identify the murderous, time-traveling doppelganger and solve the mystery!

De-mystifying the Mystery 

The murder mystery starts where murder mysteries always do: with a murder. And that starts with identifying the victim. The victim’s name becomes the seed for the rest of the characters, places, and items. Suspects are chosen based on their links to the victim and must always share a common characteristic with them. For example, Britney Spears and Diana Ross are both classified as ‘singer’ in the data used. The algorithm searches for people with links to the victim and turns them into suspects. 

But a good murder-mystery needs more than just suspects and a victim. As Sherlock Holmes says, a good investigation is ‘founded upon the observation of trifles.’ So the story must also have locations to explore, objects to investigate for clues, and people to interrogate. These are the game’s ‘trifles’ and that’s why the algorithm also searches for related articles for each suspect. The related articles about places are converted into locations in the game, and the related articles about people are converted into NPCs. Everything else is made into game items.
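For readers curious how this role assignment might look in practice, here is a minimal, purely illustrative Python sketch. It is not the authors’ implementation: the Article class and the tiny in-memory dataset merely stand in for the real queries DATA Agent makes against open data sources such as Wikipedia, and the rules simply follow the description above (linked people sharing a trait with the victim become suspects, other people become NPCs, places become locations, and everything else becomes items).

```python
from dataclasses import dataclass, field

@dataclass
class Article:
    """Stand-in for a Wikipedia/DBpedia entry: a name, a type tag, traits, and links."""
    name: str
    kind: str                                   # e.g. 'person', 'place', 'thing'
    traits: set = field(default_factory=set)    # e.g. {'singer'}
    links: list = field(default_factory=list)   # names of related articles

def build_mystery(victim_name, articles):
    """Assign game roles from linked data, loosely following the steps
    described in the text (illustrative only, not the authors' actual code)."""
    index = {a.name: a for a in articles}
    victim = index[victim_name]

    suspects, npcs, locations, items = [], [], [], []
    for name in victim.links:
        related = index.get(name)
        if related is None:
            continue
        if related.kind == 'person' and related.traits & victim.traits:
            # Linked people who share a characteristic with the victim become suspects.
            suspects.append(related.name)
        elif related.kind == 'person':
            # Other linked people become non-player characters.
            npcs.append(related.name)
        elif related.kind == 'place':
            # Linked places become explorable locations.
            locations.append(related.name)
        else:
            # Everything else becomes a game item.
            items.append(related.name)

    return {'victim': victim.name, 'suspects': suspects,
            'npcs': npcs, 'locations': locations, 'items': items}

# A toy dataset echoing the Britney Spears example from the text.
articles = [
    Article('Britney Spears', 'person', {'singer'},
            ['Diana Ross', 'Jamie Lynn Spears', 'McComb, Mississippi', 'Microphone']),
    Article('Diana Ross', 'person', {'singer'}),
    Article('Jamie Lynn Spears', 'person', {'actress'}),
    Article('McComb, Mississippi', 'place'),
    Article('Microphone', 'thing'),
]

print(build_mystery('Britney Spears', articles))
```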

The Case of Britney Spears 

This results in games like “The Case of Britney Spears” with Aretha Franklin, Diana Ross, and Taylor Hicks as the suspects. In the case of Britney Spears, the player could interact with NPCs such as Whitney Houston, Jamie Lynn Spears, and Katy Perry. They could also travel from McComb in Mississippi to New York City. As they work their way through the game, they would uncover that the evil time-traveling doppelganger had taken the place of the greatest diva of them all: Diana Ross.

Oops, I learned it again 

DATA Agent goes beyond refining the technical aspects of organising data and gameplay. In an age when so much freely available information is ignored because it is presented in an inaccessible or boring format, data games could be game-changing (pun intended). 

In 1985, Broderbund released their game Where in the World is Carmen Sandiego?, where the player tracked criminal henchmen and eventually mastermind Carmen Sandiego herself by following geographical trivia clues. It was a surprise hit, becoming Broderbund’s third best-selling Commodore game as of late 1987. It had tapped into an unanticipated market, becoming an educational staple in many North American schools. 

Facts may have lost some of their lustre since the rise of fake news, but games like Where in the World is Carmen Sandiego? are proof that learning doesn’t have to be boring. And this is where products such as DATA Agent could thrive. After all, the game uses real data and actual facts about the victims and suspects. The player’s main goal is to catch the doppelganger’s mistakes in their recounting of facts, requiring careful attention. The kind of attention you may not have when reading a textbook. This type of increased engagement with material has been linked to improved information retention. In the end, when you’ve travelled through the game’s various locations, found a number of items related to the murder victim, and uncovered the time-travelling murderer, you’ll hardly be aware that you’ve been taught.

‘Education never ends, Watson. It is a series of lessons, with the greatest for the last.’ – Sir Arthur Conan Doyle, His Last Bow. 

The rise of the academic entrepreneur

What is it that separates innovation in the lab from successful multi-million euro ventures that make money and have a positive impact on the world? The Knowledge Transfer Office’s András Havasi writes.


The sky’s role in archaeology

In 1994, Czech poet-president Vaclav Havel wrote an article discussing the role of science in helping people understand the world around them. He also noted that in this advance of knowledge, however, something was left behind. ‘We may know immeasurably more about the universe than our ancestors did, and yet it increasingly seems they knew something more essential about it than we do, something that escapes us.’

Almost all traditional cultures looked to the sky for guidance. Cosmology is what gave our ancestors their fundamental sense of where they came from, who they were, and what their role in life was. While arguably incorrect, these ideas created codes of behaviour and bestowed a sense of identity.

The cosmology of European prehistoric societies has been studied independently by archaeologists and archaeoastronomers (specialists in an interdisciplinary field between archaeology and astronomy). Despite their shared goal of shedding light on our past lives, thoughts, and ideas, the two fields have often failed to merge, mainly due to different approaches. A clear local case is the question of the Maltese megalithic temples.

Tore Lomsdalen

The Mnajdra South Temple on Malta predates both Stonehenge and the Egyptian pyramids. It is the oldest known site in the world that qualifies as a Neolithic device built to track the path of the rising sun throughout the whole year. What is unfortunate is that, so far, archaeologists and archaeoastronomers have studied the site largely in isolation.

Whether the temples were built to visualise the effects of the rising sun as seen today is an open question. But with such specific and repetitive patterning, one cannot deny that the sky was an important element in the builders’ understanding of the world—their cosmology.

With some exceptions, archaeologists have largely ignored, excluded, or underrated the importance of the sky in the cultural interpretation of the material record. When studying ancient communities, chronological dating and economic concerns are often given precedence over the immaterial.

But the fault does not lie solely with disinterested archaeologists. Archaeoastronomy has often been too concerned with collecting astronomical and orientation data, neglecting the wider archaeological record, and ignoring the human element in cosmology.

We need to find a common ground. Both sides need to open themselves up to different professional perspectives and convictions and embrace alternative interpretations and possibilities. Bridging the gap between archaeology and archaeoastronomy will allow us to paint a detailed picture of past societies. And maybe it will shed light on that lost knowledge about the universe and our place in it.


Lomsdalen and Prof. Nicholas Vella are organising an afternoon workshop on Skyscape Archaeology as well as an open symposium on Cosmology in Archaeology. For more information, visit: um.edu.mt/arts/classics-archaeo/newsandevents

Author: Tore Lomsdalen

A scientist and a linguist board a helicopter…

Amanda Mathieson

A scientist and a linguist board a helicopter, and the scientist says to the linguist, ‘What is the cornerstone of civilisation, science or language?’ It might sound like the opening line of a joke, but it’s actually from the opening sequence of the film Arrival (2016). In the film, aliens have landed on our doorstep, and our scientist and linguist have been chosen as suitable emissaries to establish contact. The scientist, perhaps wishing to size up his new colleague, then poses the question. Whose field has been more important to the advancement of the human race? Science or language? 

In reality, they are both wrong (or both half-right). It is true that language was necessary for us to organise as a species, forming complex networks of cooperation over vast distances and time. Without specialising our efforts and collaborating, we could not have built our great structures, supported large communities, or migrated over all continents. Yet, without science, without improving our understanding of the natural world, we would still be at its mercy. 

Science is the tool we use to change circumstance. When populations are dying from an infectious disease, we create a vaccine. When we’re unable to grow enough food to support ourselves, we develop a better strain of crop. When we struggle to transport materials over great distances, we create machines that will do it for us. Science is our secret weapon, transforming problems into possibilities. However, science alone means little. If innovation dies with its creator, who does it help? Science must be communicated to others before it can make a difference in any meaningful way. 

It would be incomplete to bestow language or science with the title of ‘the cornerstone of civilisation.’ It was science communication that really drove our development. And I don’t just mean this in the external sense. After all, is the transfer of genetic information from one generation to the next not science communication? What are we but a biological game of Chinese Whispers, the message mutating through each host but somehow continuing to make sense over millions of years? 

The human race not only benefits greatly from science communication; we are the product of it. It is embedded into our biological and cultural history. Proof that it is not just knowledge but the sharing of knowledge that is the real root of power. 

Hopefully the aliens agree.   

Author: Amanda Mathieson

The new digital divide

Unequal access to technology and the Internet is traditionally termed the ‘digital divide’. Both are expensive, which leaves some people behind. Today the situation has changed, with 98% of minors in Malta having home Internet access. The government targets digitally deprived students by investing hefty sums in technology at school. However, there is a new digital divide within formal education, and this time it is not about who uses technology but how they use it. 

What is the attitude toward the technology that is being used in class? What is the goal that students are using that technology to accomplish? You can have countless schools on the receiving end of whole shipments of tablets and laptops, but the sad reality is that without an effective strategy, they are unlikely to reap the full benefits of that investment. 

Dr Philip Bonanno

If technology is placed within a system that ignores students’ needs and is unresponsive, if not completely resistant, to new teaching and learning methods, the result is completely counterproductive. A teacher’s frustration with students being distracted by their devices is an everyday occurrence, and it needs to be addressed. The question is: Is this a technology-related problem or a more profound issue related to how humans discover and understand knowledge? Are these pedagogical conflicts arising from the presence of technology in class or from an epistemological clash between teachers’ and students’ beliefs about learning and knowledge sharing? 

If we define pedagogy as ‘guidance for learning’, we need to provide guidance for a variety of learning methods. By focusing only on the ‘chalk-and-talk’ method of teaching delivery, we may actually limit access to different ways of acquiring knowledge. Besides using technology to enhance teaching, digital tools and resources need to be used to empower students: first to take over the management of their own learning, and second, to pursue different technology-enhanced learning avenues for acquiring, creating, and sharing knowledge. This gives students better skills in digital and information (critical) literacy, in collaboration, and in networking, hence preparing them for the world of work. 

To make this happen, challenges await both teachers and students. Teachers need to welcome new forms of learning, offering guidance and support rather than simply ‘giving students all the information they need to know.’ Students, on the other hand, have to overcome the mental conditioning that links learning directly to teaching so they can stand on their own two feet. 

Students and teachers need to work together to adopt a more independent and customised approach to learning, enhanced and transformed through technology.  

Author: Dr Philip Bonanno