The ambitious project by Dr Stephen Thaler to have his AI system named as an inventor in patent applications filed around the world has largely been a failure. Most jurisdictions, including the UK, the EPO, the USA, Australia, and New Zealand, have rejected attempts to name the AI, referred to as DABUS (Device for the Autonomous Bootstrapping of Unified Sentience), as an inventor on a patent. The latest news on Dr Thaler’s cases is neatly summarised here.

For the most part, the applications were rejected on the grounds that the applications themselves were insufficient because they failed to properly name an inventor. Even in the absence of an explicit requirement in patent legislation, it is clear from the decisions that, as the law currently stands, only a natural person, not a machine, may be named as an inventor.

Since Dr Thaler’s applications, with DABUS named as the inventor, were deliberately intended to test the bounds of inventorship, the issues under consideration were for the most part confined to that question alone. In the appeal by the Australian Commissioner of Patents to the Full Court of the Federal Court (Commissioner of Patents v Thaler [2022] FCAFC 62), it was an agreed fact between the parties that Dr Thaler was not an inventor. The justices seemed to lament that the broader issues around identifying inventorship and establishing patentability were not before them [para 121]:

 Secondly, we do not accept the premise of the proposition, accepted by the primary judge and apparently influential in his reasoning, that if DABUS is not accepted to be an inventor, no invention devised by an artificial intelligence system is capable of being granted a patent. In the present case, it was said to be an agreed fact that DABUS is the inventor of the invention the subject of the application and that Dr Thaler is not. However, the characterisation of a person as an inventor is a question of law. The question of whether the application the subject of this appeal has a human inventor has not been explored in this litigation and remains undecided. Had this question been explored, it may have been necessary to consider what significance should be attributed to various matters including the (agreed) facts that Dr Thaler is the owner of the copyright in the DABUS source code and the computer on which DABUS operates, and that he is also responsible for the maintenance and running costs. 

The judges in the appeal to the US Court of Appeals for the Federal Circuit (Thaler v. Vidal) made a similar observation [page 10]:

Moreover, we are not confronted today with the question of whether inventions made by human beings with the assistance of AI are eligible for patent protection.

This leaves many questions unanswered about the nature of inventorship when AI, or indeed any computer software, is used to develop an invention. Although the term AI has only recently become popular, the use of computer software to assist in the development of technology has long been essential in many fields. For example, computational chemistry is often used in drug development, and finite element analysis is commonplace in many fields of engineering.

While computer analysis is often used as part of the development process, it is not clear where the line would be drawn between the contribution of the software and human ingenuity. Dr Thaler was adamant that the process performed by DABUS was entirely ‘autonomous’ and that the creation of the two inventions did not require his input. The practical reality, though, is not that DABUS spontaneously produced an invention, but that Dr Thaler, having developed the software and trained the system on data, instigated the development process by requiring DABUS to analyse a problem and provide a solution.

In a more commonplace situation, a chemist may use software to determine a chemical structure that is likely to interact with a drug target, such as the active site of a biological receptor. If the chemical structure proves suitable, can the chemist be said to be the inventor? The chemist did not develop the software, identify the biological receptor or conceive of the chemical structure. That is not to say there is no invention, since inventiveness is judged against the prior art rather than the manner in which the invention was conceived. The implication of the DABUS decisions is that although there is an invention, it possibly could not be patented unless there was an additional inventive contribution from a person.

The Australian Full Court canvassed further issues raised by recognising an AI as an inventor [para 119]:

In our view, there are many propositions that arise for consideration in the context of artificial intelligence and inventions. They include whether, as a matter of policy, a person who is an inventor should be redefined to include an artificial intelligence. If so, to whom should a patent be granted in respect of its output? The options include one or more of: the owner of the machine upon which the artificial intelligence software runs, the developer of the artificial intelligence software, the owner of the copyright in its source code, the person who inputs the data used by the artificial intelligence to develop its output, and no doubt others. If an artificial intelligence is capable of being recognised as an inventor, should the standard of inventive step be recalibrated such that it is no longer judged by reference to the knowledge and thought processes of the hypothetical uninventive skilled worker in the field? If so, how? What continuing role might the ground of revocation for false suggestion or misrepresentation have, in circumstances where the inventor is a machine?

As it currently stands, there is no clear direction on whether output produced solely by an AI is inventive, or merely a routine or expected result. In the Australian context, an AI could be a member of the notional team of skilled addressees that would consider the problem. However, as part of that team the AI could only be expected to act as a skilled but unimaginative worker. If the courts decided that an AI is inherently not inventive, then any solution that could be provided by an AI must be routine and non-inventive.

These issues are not likely to be answered soon, since it is not often that an applicant voluntarily submits their patent applications to such scrutiny. We will need to wait for a comprehensive invalidation case before the impact of AI is properly tested. Changes in legislation will no doubt follow.
