Fri 1st Dec 2023

Patentability of AI at the UKIPO following Emotional Perception

Service: Patents

Sectors: AI and data science

Tom Woodhouse explains the implications of the UK High Court's decision in the appeal of Emotional Perception AI Ltd vs Comptroller-General of Patents, Designs and Trade Marks (2023).

Emotional Perception AI Ltd vs Comptroller-General of Patents, Designs and Trade Marks (2023) concerns an appeal to the High Court against the decision of the Hearing Officer (BL/O/542/22), which rejected Emotional Perception's patent application as a “program for a computer…as such” excluded under Section 1(2)(c) of the Patents Act 1977.  That decision has now been overturned by the High Court on appeal, with far-reaching consequences for applicants seeking protection for their artificial intelligence (AI) inventions in the UK.

 

The patent application relates to the use of artificial neural networks (ANNs) for file recommendation.  The invention is concerned with training an ANN to perceive semantic similarity or dissimilarity between media files and using the trained ANN to recommend a file which is semantically similar to a given input. The files may be audio files, video files, static image files, or text files, with the application emphasising music.  According to the Appellant "[i]n these pairwise comparisons the distance in property space between the output (property) vectors of the ANN is converged to reflect the differences in semantic space between the semantic vectors of each pair of files. The result is that in the trained ANN, files clustered close together in property space will in fact have similar semantic characteristics, and those far apart in property space will have dissimilar semantic characteristics.”
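 

By way of illustration only – the following is a hypothetical sketch in PyTorch, not the applicant's actual implementation, with invented dimensions and target values – pairwise training of this kind amounts to nudging the distance between the network's output (‘property’) vectors for a pair of files towards a target distance reflecting how semantically similar the two files are:

import torch
import torch.nn as nn

# A small ANN mapping (hypothetical) 128-dimensional file features to a
# 32-dimensional 'property' vector.
ann = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 32))
optimiser = torch.optim.Adam(ann.parameters(), lr=1e-3)

def pairwise_training_step(features_a, features_b, semantic_distance):
    # One pairwise update: push the property-space distance between the two
    # files towards the target distance derived from semantic space.
    prop_a, prop_b = ann(features_a), ann(features_b)
    property_distance = torch.norm(prop_a - prop_b)
    loss = (property_distance - semantic_distance) ** 2
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
    return loss.item()

# e.g. a pair of files labelled as semantically close (target distance 0.1):
pairwise_training_step(torch.randn(128), torch.randn(128), torch.tensor(0.1))

After training on many such pairs, files whose property vectors sit close together should have similar semantic characteristics, mirroring the clustering behaviour described in the passage quoted above.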

 

The judgment has wide-ranging consequences for inventions concerning:

a. Neural networks;

b. Model training;

c. 'Subjective’ or ‘cognitive’ benefits.

 

Notably, the judgement is confined in substance to the Section 1(2)(c) exclusion on “programs for computers…as such” (para. 2).  The European Patent Office currently assesses the technical effect of machine learning inventions under the “mathematical methods…as such” exclusion of Art. 52(2)(a) EPC, corresponding to Section 1(2)(a) of the UK Patents Act 1977.  Curiously, the UKIPO Hearing Officer did not invoke the “mathematical methods” exclusion in the decision under appeal, and the Court declined to consider it on procedural grounds (para. 82).

 

The UKIPO has suspended its recent guidelines for AI inventions as of 23 November 2023 [1], and has issued temporary guidance in light of Emotional Perception [2], which omits any mention of the Section 1(2)(a) exclusion on “mathematical methods…as such”.  There is therefore currently an open question as to how the mathematical methods exclusion will be applied by the UKIPO and the UK Courts going forward.

 

a. Neural networks

Para. 56 of the Emotional Perception judgement contains the headline-grabbing conclusion that an ANN implemented in software is not a program for a computer, but rather a “software emulation” of a hardware ANN, with the result that it does not even invoke the exclusion of Section 1(2)(c) on “programs for computers…as such”.  This conclusion appears to rest on an earlier finding that the “inner workings” of a hardware neural network are not a “program for a computer”, a point that was apparently not disputed by the UKIPO (para. 41).

 

This can be contrasted with the approach adopted by the Court of Appeal in Gale's Application [1991] R.P.C. 305 (not referred to in the Emotional Perception judgement):

 

“if Mr. Gale's discovery or method or program were embodied in a floppy disc (software) neither the disc nor a computer into whose RAM that programs had been inserted could be patented, it must, in my view, follows [sic] that the silicon chip with its circuitry embodying the program (hardware) cannot be patented either. The disc which embodies the programs in my view has its structure altered, albeit differently, just as does the chip which embodies the programs. Both are useless until they are altered so as to embody the programs. When so altered they have the self-same end result…

 

A computer program remains a computer program whether contained in software or hardware.” (emphasis added)

 

In Gale’s Application, the Court of Appeal (a higher authority than the High Court) apparently starts from the premise that, when considering whether a “conventional” hardware implementation of a ‘program’ escapes the exclusion of Section 1(2)(c), the correct starting point is the “software” implementation; if the software implementation is excluded, so too apparently is the generic hardware implementation.  By contrast, Emotional Perception starts from the ‘hardware’ version, and then frames the software version as an emulation of the hardware.

 

The reasoning of para. 56 of Emotional Perception is interesting:

 

“It seems to me that it is appropriate to look at the emulated ANN as, in substance, operating at a different level (albeit metaphorically) from the underlying software on the computer, and it is operating in the same way as the hardware ANN.” (emphasis added)

 

The ‘metaphor’ is not made explicit.  Neural networks are sometimes visualized as a graph of nodes and edges.  To a computer scientist, this visual metaphor is termed a computational graph, and is simply a visual representation of the mathematical expressions defining the operation of the neural network.  Every computer program (machine learning or not) can be represented using the same visual metaphor (including, for example, the square root calculation algorithm of Gale’s Application).  It is presumably not the case that any computer program can now be considered an emulation of its ‘metaphorical’ hardware computational graph, as that would apparently render the exclusion of Section 1(2)(c) meaningless, and would appear to directly contradict Gale’s Application.
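 

To illustrate the point (a sketch of our own, not taken from the judgement or the application): the same node-and-edge ‘computational graph’ notation can describe both a single artificial neuron and one step of a conventional square-root routine of the kind considered in Gale’s Application.

# Each node is (operation, operands); operands are other nodes, variable
# names, or constants – i.e. exactly the 'nodes and edges' visual metaphor.
neuron    = ("relu", ("add", ("mul", "w", "x"), "b"))       # y = relu(w*x + b)
sqrt_step = ("mul", 0.5, ("add", "x", ("div", "n", "x")))   # Newton step towards sqrt(n)

def evaluate(node, env):
    # Walk the graph and compute its value for concrete leaf inputs.
    if not isinstance(node, tuple):
        return env.get(node, node)                          # variable or constant leaf
    op, *args = node
    values = [evaluate(arg, env) for arg in args]
    ops = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b,
           "div": lambda a, b: a / b, "relu": lambda a: max(a, 0.0)}
    return ops[op](*values)

print(evaluate(neuron, {"w": 2.0, "x": 3.0, "b": -1.0}))    # 5.0
print(evaluate(sqrt_step, {"x": 1.5, "n": 2.0}))            # 1.4166...

Both graphs are, in substance, the same kind of object; whether one but not the other amounts to a ‘program for a computer’ is precisely the difficulty identified above.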

 

On the other hand, if the ‘metaphor’ refers to the parallels that are sometimes drawn between an ANN and the human brain (acknowledged in para. 15 of the Emotional Perception judgement), this could instead be interpreted as carving out a ‘special status’ for ANNs.  This is perhaps more reconcilable with Gale’s application, although it remains to be seen what other ‘metaphors’ might be sufficient to escape the exclusion of Section 1(2)(c) in the future.

The UKIPO has often faced the question of computer “hardware” vs. “software”, including recently in Imagination Technologies Limited (decision O/420/21 of the Hearing Officer). Distinguishing over Gale’s “hardware” implementation using “conventional” Read Only Memory, the Hearing Officer construed “fixed function circuitry” as “a specific piece of hardware which is not programmed or programmable in any way” that does not invoke the computer program exclusion (para. 19), emphasising the absence of any “processor which needs to be told what to do with any stored instructions” (a definition that is not necessarily straightforward to apply in practice).  It remains to be seen how such reasoning will be impacted by the “software emulation” findings of Emotional Perception.

 

b. Model training

With regard to training, the judgement recognizes that the training of an ANN “involves some programming activity” (para. 60).  Para. 61 concludes:

 

“So far, then, there is a computer program involved [in training], at least at that level. However, it does not seem to me that the claim claims that program. What is said to be special is the idea of using pairs of files for training, and setting the training objective and parameters accordingly. If that is right, and I consider it is, then the actual program is a subsidiary part of the claim and is not what is claimed. The claims go beyond that. The idea of the parameters itself is not necessarily part of the program. On this footing as a matter of construction the claim is not to a computer program at all. The exclusion is not invoked.”

 

Interestingly, this part of the judgement does not appear to rest on the “software emulation” point, or, indeed, anything specific to artificial neural networks.  Aside from the reference to “pairs of files for training” – apparently referencing a particular training setup summarized in paras. 10-11, which itself appears to be a form of so-called contrastive learning – this description appears to broadly characterize any machine learning process.

 

The idea that the ‘parameters’ might not form part of the training computer program is interesting.  Would this apply, for example, to the ‘learning’ of a simple linear regression model, or – to put it in layman’s terms – fitting a line or plane (y=ax+b) to a set of points?
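 

For concreteness (a minimal sketch of our own, not anything put before the Court): fitting y = ax + b by ordinary least squares is itself a short program whose output is the parameters a and b – which is what makes the question of where the ‘program’ ends and the ‘parameters’ begin an interesting one.

# Minimal least-squares fit of y = a*x + b; the 'learned parameters' a and b
# are simply the values this short program computes from the data.
def fit_line(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
print(a, b)   # 2.0 1.0, i.e. the line y = 2x + 1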

 

c. ‘Subjective’ or ‘cognitive’ benefits

Paras. 63-76 consider the question of the Section 1(2)(c) exclusion if the broader conclusions on ANNs turn out to be wrong.  Whilst less impactful on the face of it, these conclusions are arguably applicable to a broad class of inventions concerning AI-based recommendations communicated within a computer network, and potentially even more broadly applicable to inventions concerning ‘cognitive’ or ‘subjective’ benefits or criteria.

 

Noting in para. 68 that “[t]he technical effect relied on by Emotional Perception in this respect is the sending of an improved recommendation message”, the Court concludes in para. 76:

 

“The correct view of what happened, for these purposes, is that a file has been identified, and then moved, because it fulfilled certain criteria. True it is that those criteria are not technical criteria in the sense that they can be described in purely technical terms, but they are criteria nonetheless, and the ANN has certainly gone about its analysis and selection in a technical way. It is not just any old file; it is a file identified as being semantically similar by the application of technical criteria which the system has worked out for itself. So the output is of a file that would not otherwise be selected. That seems to me to be a technical effect outside the computer for these purposes, and when coupled with the purpose and method of selection it fulfils the requirement of technical effect in order to escape the exclusion. I do not see why the possible subjective effect within a user’s own non-artificial neural network should disqualify it for these purposes.” (emphasis added)

 

This line of reasoning is expressly decoupled from the ‘software emulation’ point and has potentially wide-reaching consequences for inventions that utilize AI to generate recommendation messages within a computer network, such as recommending items to add to a shopping cart during online checkout, targeting adverts to a particular user, or other fields of application not historically regarded as “technical”.

 

The Hearing Officer’s conclusion that such “beneficial effect is of a subjective and cognitive nature” and therefore non-technical was expressly rejected (para. 29).  Inventions based on “cognitive” or “subjective” criteria or benefits are routinely rejected at the EPO, where – in apparent contrast to Emotional Perception – the learning of semantic relationships through training is not regarded as a technical activity unless the trained model supports a technical purpose [3].

 

For more information on patents relating to AI and machine learning, please contact Tom Woodhouse.

 

This briefing is for general information purposes only and should not be used as a substitute for legal advice relating to your particular circumstances. We can discuss specific issues and facts on an individual basis. Please note that the law may have changed since the day this was first published in December 2023.

 

[1] https://www.gov.uk/government/publications/examining-patent-applications-relating-to-artificial-intelligence-ai-inventions

 

[2] Examination of patent applications involving artificial neural networks (ANN), GOV.UK (www.gov.uk)

 

[3] EPO Guidelines for Examination G-II 3.3.1 Artificial intelligence and machine learning
