ARCoSS · LNCS 13240
Ilya Sergey (Ed.)
Programming Languages and Systems
31st European Symposium on Programming, ESOP 2022
Held as Part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2022
Munich, Germany, April 2–7, 2022
Proceedings
Lecture Notes in Computer Science 13240
Founding Editors
Gerhard Goos, Germany
Juris Hartmanis, USA
Editor
Ilya Sergey
National University of Singapore
Singapore, Singapore
This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
ETAPS Foreword
Welcome to the 25th ETAPS! ETAPS 2022 took place in Munich, the beautiful capital
of Bavaria, Germany.
ETAPS 2022 was the 25th instance of the European Joint Conferences on Theory and
Practice of Software. ETAPS is an annual federated conference established in 1998,
and consists of four conferences: ESOP, FASE, FoSSaCS, and TACAS. Each
conference has its own Program Committee (PC) and its own Steering Committee
(SC). The conferences cover various aspects of software systems, ranging from
theoretical computer science to foundations of programming languages, analysis tools, and
formal approaches to software engineering. Organizing these conferences in a coherent,
highly synchronized conference program enables researchers to participate in an
exciting event, meet colleagues working in different directions in the field, and
easily attend talks across the different conferences. On the weekend before the main
conference, numerous satellite workshops took place, attracting many researchers
from all over the globe.
ETAPS 2022 received 362 submissions in total, 111 of which were accepted,
yielding an overall acceptance rate of 30.7%. I thank all the authors for their interest in
ETAPS, all the reviewers for their reviewing efforts, the PC members for their
contributions, and in particular the PC (co-)chairs for their hard work in running this entire
intensive process. Last but not least, my congratulations to all authors of the accepted
papers!
ETAPS 2022 featured the unifying invited speakers Alexandra Silva (University
College London, UK, and Cornell University, USA) and Tomáš Vojnar (Brno
University of Technology, Czech Republic) and the conference-specific invited
speakers Nathalie Bertrand (Inria Rennes, France) for FoSSaCS and Lenore Zuck
(University of Illinois at Chicago, USA) for TACAS. Invited tutorials were provided by
Stacey Jeffery (CWI and QuSoft, The Netherlands) on quantum computing and
Nicholas Lane (University of Cambridge and Samsung AI Lab, UK) on federated
learning.
As this event was the 25th edition of ETAPS, part of the program was a special
celebration where we looked back on the achievements of ETAPS and its constituent
conferences, but also looked into the future and discussed the challenges ahead for
research in software science. This edition also reinstated the ETAPS mentoring
workshop for PhD students.
ETAPS 2022 took place in Munich, Germany, and was organized jointly by the
Technical University of Munich (TUM) and the LMU Munich. The former was
founded in 1868, and the latter in 1472, making it the sixth-oldest German university
still operating today. Together, they have 100,000 enrolled students, regularly rank among the top
100 universities worldwide (with TUM’s computer-science department ranked #1 in
the European Union), and their researchers and alumni include 60 Nobel laureates.
The local organization team consisted of Jan Křetínský (general chair), Dirk Beyer
(general, financial, and workshop chair), Julia Eisentraut (organization chair), and
Alexandros Evangelidis (local proceedings chair).
ETAPS 2022 was further supported by the following associations and societies:
ETAPS e.V., EATCS (European Association for Theoretical Computer Science),
EAPLS (European Association for Programming Languages and Systems), and EASST
(European Association of Software Science and Technology).
The ETAPS Steering Committee consists of an Executive Board, and representa-
tives of the individual ETAPS conferences, as well as representatives of EATCS,
EAPLS, and EASST. The Executive Board consists of Holger Hermanns
(Saarbrücken), Marieke Huisman (Twente, chair), Jan Kofroň (Prague), Barbara König
(Duisburg), Thomas Noll (Aachen), Caterina Urban (Paris), Tarmo Uustalu (Reykjavik
and Tallinn), and Lenore Zuck (Chicago).
Other members of the Steering Committee are Patricia Bouyer (Paris), Einar Broch
Johnsen (Oslo), Dana Fisman (Be’er Sheva), Reiko Heckel (Leicester), Joost-Pieter
Katoen (Aachen and Twente), Fabrice Kordon (Paris), Jan Křetínský (Munich), Orna
Kupferman (Jerusalem), Leen Lambers (Cottbus), Tiziana Margaria (Limerick),
Andrew M. Pitts (Cambridge), Elizabeth Polgreen (Edinburgh), Grigore Roşu (Illinois),
Peter Ryan (Luxembourg), Sriram Sankaranarayanan (Boulder), Don Sannella
(Edinburgh), Lutz Schröder (Erlangen), Ilya Sergey (Singapore), Natasha Sharygina
(Lugano), Pawel Sobocinski (Tallinn), Peter Thiemann (Freiburg), Sebastián Uchitel
(London and Buenos Aires), Jan Vitek (Prague), Andrzej Wasowski (Copenhagen),
Thomas Wies (New York), Anton Wijs (Eindhoven), and Manuel Wimmer (Linz).
I’d like to take this opportunity to thank all authors, attendees, organizers of the
satellite workshops, and Springer-Verlag GmbH for their support. I hope you all
enjoyed ETAPS 2022.
Finally, a big thanks to Jan, Julia, Dirk, and their local organization team for all their
enormous efforts to make ETAPS a fantastic event.
Preface

This volume contains the papers accepted at the 31st European Symposium on
Programming (ESOP 2022), held during April 5–7, 2022, in Munich, Germany
(COVID-19 permitting). ESOP is one of the European Joint Conferences on Theory
and Practice of Software (ETAPS); it is dedicated to fundamental issues in the
specification, design, analysis, and implementation of programming languages and systems.
The 21 papers in this volume were selected by the Program Committee (PC) from
64 submissions. Each submission received three or four reviews. After
receiving the initial reviews, the authors had a chance to respond to questions and
clarify misunderstandings of the reviewers. After the author response period, the papers
were discussed electronically using the HotCRP system by the 33 Program Committee
members and 33 external reviewers. Two papers, for which the PC chair had a conflict
of interest, were kindly managed by Zena Ariola. The reviewing for ESOP 2022 was
double-anonymous, and the authors' identities were revealed only for the eventually
accepted papers.
Following the example set by other major conferences in programming languages,
for the first time in its history, ESOP featured optional artifact evaluation. Authors
of the accepted manuscripts were invited to submit artifacts, such as code, datasets, and
mechanized proofs, that supported the conclusions of their papers. Members of the
Artifact Evaluation Committee (AEC) read the papers and explored the artifacts,
assessing their quality and checking that they supported the authors’ claims. The
authors of eleven of the accepted papers submitted artifacts, which were evaluated by
20 AEC members, with each artifact receiving four reviews. Authors of papers with
accepted artifacts were assigned official EAPLS artifact evaluation badges, indicating
that they took the extra time and underwent the extra scrutiny needed to prepare a
useful artifact. The ESOP 2022 AEC awarded Artifacts Functional and Artifacts
(Functional and) Reusable badges. All submitted artifacts were deemed Functional, and
all but one were found to be Reusable.
My sincere thanks go to all who contributed to the success of the conference and to
its exciting program. This includes the authors who submitted papers for consideration;
the external reviewers who provided timely expert reviews, sometimes on very short
notice; the AEC members and chairs who took great care of this new aspect of ESOP;
and, of course, the members of the ESOP 2022 Program Committee. I was extremely
impressed by the excellent quality of the reviews, the amount of constructive feedback
given to the authors, and the criticism delivered in a professional and friendly tone.
I am very grateful to Andreea Costea and KC Sivaramakrishnan who kindly agreed to
serve as co-chairs for the ESOP 2022 Artifact Evaluation Committee. I would like to
thank the ESOP 2021 chair Nobuko Yoshida for her advice, patience, and the many
insightful discussions on the process of running the conference. I thank all who
contributed to the organization of ESOP: the ESOP steering committee and its chair Peter
Thiemann, as well as the ETAPS steering committee and its chair Marieke Huisman.
Finally, I would like to thank Barbara König and Alexandros Evangelidis for their help
with assembling the proceedings.
Program Chair
Ilya Sergey National University of Singapore, Singapore
Program Committee
Michael D. Adams Yale-NUS College, Singapore
Danel Ahman University of Ljubljana, Slovenia
Aws Albarghouthi University of Wisconsin-Madison, USA
Zena M. Ariola University of Oregon, USA
Ahmed Bouajjani Université de Paris, France
Giuseppe Castagna CNRS, Université de Paris, France
Cristina David University of Bristol, UK
Mariangiola Dezani Università di Torino, Italy
Rayna Dimitrova CISPA Helmholtz Center for Information Security, Germany
Jana Dunfield Queen’s University, Canada
Aquinas Hobor University College London, UK
Guilhem Jaber Université de Nantes, France
Jeehoon Kang KAIST, South Korea
Ekaterina Komendantskaya Heriot-Watt University, UK
Ori Lahav Tel Aviv University, Israel
Ivan Lanese Università di Bologna, Italy, and Inria, France
Dan Licata Wesleyan University, USA
Sam Lindley University of Edinburgh, UK
Andreas Lochbihler Digital Asset, Switzerland
Cristina Lopes University of California, Irvine, USA
P. Madhusudan University of Illinois at Urbana-Champaign, USA
Stefan Marr University of Kent, UK
James Noble Victoria University of Wellington, New Zealand
Burcu Kulahcioglu Ozkan Delft University of Technology, The Netherlands
Andreas Pavlogiannis Aarhus University, Denmark
Vincent Rahli University of Birmingham, UK
Robert Rand University of Chicago, USA
Christine Rizkallah University of Melbourne, Australia
Alejandro Russo Chalmers University of Technology, Sweden
Gagandeep Singh University of Illinois at Urbana-Champaign, USA
Gordon Stewart BedRock Systems, USA
Joseph Tassarotti Boston College, USA
Bernardo Toninho Universidade NOVA de Lisboa, Portugal
Additional Reviewers
Andreas Abel Gothenburg University, Sweden
Guillaume Allais University of St Andrews, UK
Kalev Alpernas Tel Aviv University, Israel
Davide Ancona Università di Genova, Italy
Stephanie Balzer Carnegie Mellon University, USA
Giovanni Bernardi Université de Paris, France
Soham Chakraborty Delft University of Technology, The Netherlands
Arthur Chargueraud Inria, France
Ranald Clouston Australian National University, Australia
Fredrik Dahlqvist University College London, UK
Olivier Danvy Yale-NUS College, Singapore
Benjamin Delaware Purdue University, USA
Dominique Devriese KU Leuven, Belgium
Paul Downen University of Massachusetts, Lowell, USA
Yannick Forster Saarland University, Germany
Milad K. Ghale University of New South Wales, Australia
Kiran Gopinathan National University of Singapore, Singapore
Tristan Knoth University of California, San Diego, USA
Paul Levy University of Birmingham, UK
Umang Mathur National University of Singapore, Singapore
McKenna McCall Carnegie Mellon University, USA
Garrett Morris University of Iowa, USA
Fredrik Nordvall Forsberg University of Strathclyde, UK
José N. Oliveira University of Minho, Portugal
Alex Potanin Australian National University, Australia
Susmit Sarkar University of St Andrews, UK
Filip Sieczkowski Heriot-Watt University, UK
Kartik Singhal University of Chicago, USA
Sandro Stucki Chalmers University of Technology and University of Gothenburg, Sweden
Amin Timany Aarhus University, Denmark
Klaus v. Gleissenthall Vrije Universiteit Amsterdam, The Netherlands
Thomas Wies New York University, USA
Vladimir Zamdzhiev Inria, Loria, Université de Lorraine, France
1 Introduction
The last decade has witnessed a surge of interest in machine learning, fuelled by
the numerous successes and applications that these methodologies have found in
many fields of science and technology. As machine learning techniques become
increasingly pervasive, algorithms and models become more sophisticated, posing
a significant challenge both to the software developers and the users that need to
interface, execute and maintain these systems. In spite of this rapidly evolving
picture, the formal analysis of many learning algorithms mostly takes place at a
heuristic level [41], or using definitions that fail to provide a general and scalable
framework for describing machine learning. Indeed, it is commonly acknowledged
across academia, industry, policy makers, and funding agencies that there is a
pressing need for a unifying perspective, which can make this growing body of
work more systematic, rigorous, transparent, and accessible both for users and
developers [2, 36].
Consider, for example, one of the most common machine learning scenar-
ios: supervised learning with a neural network. This technique trains the model
towards a certain task, e.g. the recognition of patterns in a data set (cf. Fig-
ure 1). There are several different ways of implementing this scenario. Typically,
at their core, there is a gradient update algorithm (often called the “optimiser”),
depending on a given loss function, which updates in steps the parameters of the
network, based on some learning rate controlling the “scaling” of the update.
This scenario highlights three fundamental aspects of learning:
(I) computation is parametric: in the simplest case, a model is a function
f : P × A → B, and learning consists of finding a parameter p : P such
that f (p, −) is the best function according to some criteria. Specifically, the
weights on the internal nodes of a neural network are a parameter which the
learning is seeking to optimize. Parameters also arise elsewhere, e.g. in the
loss function (see later).
(II) information flows bidirectionally: in the forward direction, the computation
turns inputs via a sequence of layers into predicted outputs, and then
into a loss value; in the reverse direction, backpropagation is used to propagate
the changes backwards through the layers, and then turn them into
parameter updates.
(III) the basis of parameter update via gradient descent is differentiation, e.g.
in the simple case we differentiate the function mapping a parameter to its
associated loss, in order to reduce that loss.
We model bidirectionality via lenses [6, 12, 29] and, based upon the above
three insights, we propose the notion of parametric lens as the fundamental
semantic structure of learning. In a nutshell, a parametric lens is a process with
three kinds of interfaces: inputs, outputs, and parameters. On each interface,
information flows both ways, i.e. computations are bidirectional. These data
are best explained with our graphical representation of parametric lenses, with
inputs A, A′ , outputs B, B ′ , parameters P , P ′ , and arrows indicating information
flow (below left). The graphical notation also makes evident that parametric
lenses are open systems, which may be composed along their interfaces (below
center and right).
[Diagram (1): parametric lenses with inputs A, A′, outputs B, B′, and parameters P, P′ (left); composing along the input/output interfaces yields a lens from A, A′ to C, C′ (center); composing along the parameter interfaces stacks Q, Q′ above P, P′ (right).]
This pictorial formalism is not just an intuitive sketch: as we will show, it can
be understood as a completely formal (graphical) syntax using the formalism of
string diagrams [39], in a way similar to how other computational phenomena
have been recently analysed e.g. in quantum theory [14], control theory [5, 8],
and digital circuit theory [26].
It is intuitively clear how parametric lenses express aspects (I) and (II) above,
whereas (III) will be achieved by studying them in a space of ‘differentiable
objects’ (in a sense that will be made precise). The main technical contribution
of our paper is showing how the various ingredients involved in learning (the
model, the optimiser, the error map and the learning rate) can be uniformly
understood as being built from parametric lenses.
We will use category theory as the formal language to develop our notion of
parametric lenses, and make Figure 2 mathematically precise. The categorical
perspective brings several advantages, which are well-known, established principles
in programming language semantics [3, 40, 49]. Three of them are particularly relevant here.
[Fig. 2 diagram: an Optimiser lens attached above the parameter ports P, P′ of a Model lens from (A, A′) to (B, B′), composed with a Loss lens whose loss ports L, L′ are capped by the Learning rate.]
Fig. 2: The parametric lens that captures the learning process informally sketched
in Figure 1. Note that each component is itself a lens; their composition yields the
interactions described in Figure 1. Defining this picture formally will be the
subject of Sections 3–4.
2 Categorical Toolkit
Example 1. Take the category Smooth whose objects are natural numbers and
whose morphisms f : n → m are smooth maps from R^n to R^m. As described
above, the category Para(Smooth) can be thought of as a category of neural
networks: a map in this category from n to m consists of a choice of p and a
map f : R^p × R^n → R^m, with R^p representing the set of possible weights of the
neural network.
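To make this concrete, here is a minimal Python sketch (ours, not the paper's library) of a Para(Smooth) morphism: a parameter dimension together with a function, with composition pairing the parameter spaces. The class and names are our own illustration, assuming numpy.

import numpy as np

# Our sketch of a Para(Smooth) morphism n -> m: a parameter dimension
# p_dim together with a function f : R^p x R^n -> R^m.
class ParaMap:
    def __init__(self, p_dim, f):
        self.p_dim = p_dim
        self.f = f

    def __rshift__(self, other):
        # Composition pairs the parameter spaces: run self first, then
        # other, splitting the combined parameter vector accordingly.
        def g(params, x):
            p, q = params[:self.p_dim], params[self.p_dim:]
            return other.f(q, self.f(p, x))
        return ParaMap(self.p_dim + other.p_dim, g)

# A toy map 2 -> 1 with two weights, in the spirit of Example 1:
layer = ParaMap(2, lambda p, x: np.tanh(p @ x))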
As we will see in the next sections, the interplay of the various components
at work in the learning process becomes much clearer once we represent the
morphisms of Para(C) using the pictorial formalism of string diagrams, which we
now recall. In fact, we will mildly massage the traditional notation for string
diagrams (below left), by representing a morphism f : A → B in Para(C) as
below right.
[Diagram: the morphism f drawn with P and A as ordinary input wires, in traditional string-diagram notation (left), and with the parameter wire P entering the box f vertically, in our Para notation (right).]
This is to emphasise the special role played by P , reflecting the fact that in
machine learning data and parameters have different semantics. String-diagrammatic
notation also allows us to neatly represent composition of maps (P, f) : A →
B and (P′, f′) : B → C (below left), and “reparameterisation” of (P, f) : A → B
by a map α : Q → P (below right), yielding a new map (Q, (α ⊗ 1_A); f) : A → B.
[Diagram (2): composition of (P, f) and (P′, f′) with both parameter wires drawn vertically (left); reparameterisation of (P, f) by α : Q → P, with α stacked above the parameter port of f (right).]
(Footnote 4) One can also define Para(C) in the case when C is non-strict; however, the
result would not be a category but a bicategory.
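Reparameterisation has an equally small footprint in code. Continuing the hypothetical ParaMap sketch above, precomposing the parameter port with a map alpha : R^q -> R^p gives:

# Reparameterisation of (P, f) by alpha : R^q -> R^p, continuing the
# ParaMap sketch above (names ours):
def reparameterise(pm, q_dim, alpha):
    return ParaMap(q_dim, lambda q, x: pm.f(alpha(q), x))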
2.2 Lenses
[Diagram: a lens (f, f∗) from (A, A′) to (B, B′), with forward map f : A → B and backward map f∗ : A × B′ → A′.]
[Diagram (3): composition of lenses (f, f∗) : (A, A′) → (B, B′) and (g, g∗) : (B, B′) → (C, C′): the forward maps compose as f then g, while the backward pass feeds f's forward output and the C′-change through g∗, and the result through f∗.]
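As a hedged illustration of the composition rule (3), here is a small Python sketch. We follow the convention of the library code in Section 5, where the backward map takes a single (input, change) pair; the class name matches the Lens class used there, but the code is our reconstruction, not the authors'.

# Our reconstruction of lenses and of composition rule (3); rev takes
# a single (input, change) pair, matching the Section 5 convention.
class Lens:
    def __init__(self, fwd, rev):
        self.fwd, self.rev = fwd, rev

    def __rshift__(self, other):
        # Forward maps compose left to right; the backward pass replays
        # the first forward value through the second lens's rev.
        fwd = lambda a: other.fwd(self.fwd(a))
        rev = lambda a_dc: self.rev(
            (a_dc[0], other.rev((self.fwd(a_dc[0]), a_dc[1]))))
        return Lens(fwd, rev)

# The identity lens (1_A, pi_2):
identity = Lens(lambda x: x, lambda x_dy: x_dy[1])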
The fundamental category where supervised learning takes place is the composite
Para(Lens(C)) of the two constructions in the previous sections:
For f : A → B, the pair (f, R[f]) forms a lens from (A, A) to (B, B). We
will pursue the idea that R[f] acts as the backwards map, thus giving a means to
“learn” f.
(Footnote 5) In [23], these are called learners. However, in this paper we study them in a much
broader light; see Section 6.
Note that assigning the type A × B → A to R[f] hides some relevant information:
B-values in the domain and A-values in the codomain of R[f] do not play the
same role as values of the same types in f : A → B. In R[f], they really take in a
tangent vector at B and output a tangent vector at A (cf. the definition of R[f]
in Smooth, Example 2 below). To emphasise this, we will type R[f] as a map
A × B ′ → A′ (even though in reality A = A′ and B = B ′ ), thus meaning that
(f, R[f ]) is actually a lens from (A, A′ ) to (B, B ′ ). This typing distinction will
be helpful later on, when we want to add additional components to our learning
algorithms.
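For a concrete instance of the lens (f, R[f]) in Smooth, consider a linear map: its reverse derivative is multiplication by the transposed matrix. A small sketch, assuming numpy and the Lens class sketched above:

# The lens (f, R[f]) for the linear map f(x) = M @ x in Smooth:
# the reverse derivative is R[f](x, dy) = M.T @ dy.
M = np.array([[1.0, 2.0], [3.0, 4.0]])
f_lens = Lens(lambda x: M @ x,
              lambda x_dy: M.T @ x_dy[1])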
The following two examples of CRDCs (Cartesian reverse differential categories) will
serve as the basis for the learning scenarios of the upcoming sections.
[Diagram (4): the functor sending a parametric map (P, f) : A → B to the parametric lens (P, (f, R[f])) from (A, A′) to (B, B′), with parameter ports P, P′.]
A loss map for a model with output type B is a morphism of Para(C), i.e. a
parametric map from B to R with parameter space B, the parameter playing the
role of the ground truth. We also generalize the codomain to an arbitrary object L.
Note that we can precompose a loss map (B, loss) : B → L with a neural
network (P, f ) : A → B (below left), and apply the functor in (4) (with C =
Smooth) to obtain the parametric lens below right.
[Diagram (5): the composite of the network (P, f) : A → B with the loss map (B, loss) : B → L (left) is sent by the functor in (4) to a parametric lens with parameter ports (P, P′) and (B, B′), input ports (A, A′), and loss ports (L, L′) (right).]
This is getting closer to the parametric lens we want: it can now receive
inputs of type B. However, this is at the cost of now needing an input to L′ ; we
consider how to handle this in the next section.
Example 9 (Dot product). In Deep Dreaming (Section 4.2) we often want to focus
only on a particular element of the network output R^b. This is done by supplying
a one-hot vector b_t as the ground truth to the loss function e(b_t, b_p) = b_t · b_p, which
computes the dot product of two vectors. If the ground-truth vector is one-hot
(active at the i-th element), then the dot product masks all inputs except the i-th one.
Note the reverse derivative R[e] : R^b × R^b × R → R^b × R^b of the dot product is
defined as R[e](b_t, b_p, α) = (α · b_p, α · b_t).
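In code, the dot-product loss of Example 9 becomes a one-line lens; the sketch below is ours, reusing the Lens convention and numpy import from above:

# Example 9 as a lens: forward map e(bt, bp) = bt . bp; backward map
# R[e]((bt, bp), alpha) = (alpha * bp, alpha * bt).
dot_loss = Lens(lambda b: np.dot(b[0], b[1]),
                lambda b_a: (b_a[1] * b_a[0][1], b_a[1] * b_a[0][0]))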
[Diagram (6): the learning rate α capping the loss ports (L, L′) of the Model ; Loss composite.]
Example 10. In standard supervised learning in Smooth, one fixes some ϵ > 0
as a learning rate, and this is used to define α: α is simply the constant −ϵ, i.e.,
α(l) = −ϵ for any l ∈ L.
Example 11. In supervised learning in POLY_Z2, the standard learning rate is
quite different: for a given L it is defined as the identity function, α(l) = l.
Other learning rate morphisms are possible as well: for example, one could
fix some ϵ > 0 and define a learning rate in Smooth by α(l) = −ϵ · l. Such a
choice would take into account how far away the network is from its desired goal
and adjust the learning rate accordingly.
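As plain functions, these learning rates are trivial to write down; a minimal sketch (names ours):

# Learning rates as maps out of the loss object:
def constant_rate(eps):
    return lambda l: -eps          # Example 10: alpha(l) = -eps, in Smooth

identity_rate = lambda l: l        # Example 11: alpha(l) = l, in POLY_Z2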
Intuitively, such a lens allows one to receive the requested change in parameter
and implement that change by adding that value to the current parameter. By its
type, we can now “plug” the gradient descent lens G : (P, P ) → (P, P ′ ) above the
model (f, R[f ]) in (4) — formally, this is accomplished as a reparameterisation
of the parametric morphism (f, R[f ]), cf. Section 2.1. This gives us Figure 3
(left).
[Fig. 3 diagrams: reparameterising the Model by basic gradient descent, an addition lens over the parameter ports P, P′ (left), and by a stateful Optimiser lens (S × P, S × P) → (P, P′) (right).]
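Basic gradient descent as a lens is then just the identity forward and addition backward; a sketch in the same style as above:

# Basic gradient descent G : (P, P) -> (P, P'): get is the identity,
# put adds the requested change to the current parameter.
gradient_descent = Lens(lambda p: p,
                        lambda p_dp: p_dp[0] + p_dp[1])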
Other variants of gradient descent also fit naturally into this framework by
allowing for additional input/output data with P . In particular, many of them
keep track of the history of previous updates and use that to inform the next one.
This is easy to model in our setup: instead of asking for a lens (P, P ) → (P, P ′ ),
we ask instead for a lens (S ×P, S ×P ) → (P, P ′ ) where S is some “state” object.
(Footnote 7) Note that, as in the discussion in Section 2.4, we are implicitly assuming that P = P′;
we have merely notated them differently to emphasize the different “roles” they play
(the first P can be thought of as “points”, the second as “vectors”).
[Diagram (7): the composite Model ; Loss ; α with all of its interfaces (A, A′), (P, P′), and (B, B′) bent into vertical parameter ports.]
This composite is now a map in Para(Lens(C)) from (1, 1) to (1, 1); all its inputs
and outputs are now vertical wires, i.e., parameters. Unpacking it further, this is
a lens of type (A × P × B, A′ × P′ × B′) → (1, 1) whose get map is the terminal
map, and whose put map is of the type A × P × B → A′ × P′ × B′. It can be
unpacked as the composite put(a, p, b_t) = (a′, p′, b′_t), where
b_p = f(p, a),   (b′_t, b′_p) = R[loss](b_t, b_p, α(loss(b_t, b_p))),   (p′, a′) = R[f](p, a, b′_p).
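Read as code, this put map is a straightforward function; the following sketch assumes f, Rf, loss, Rloss, and alpha are given as Python functions (names ours, mirroring the equations above):

# The put map of composite (7), written out:
def make_put(f, Rf, loss, Rloss, alpha):
    def put(a, p, bt):
        bp = f(p, a)                                    # forward pass
        dbt, dbp = Rloss(bt, bp, alpha(loss(bt, bp)))   # loss backward pass
        dp, da = Rf(p, a, dbp)                          # model backward pass
        return da, dp, dbt
    return put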
In the next two sections we consider further additions to the image above which
correspond to different types of supervised learning.
[Diagram: the complete supervised learning system: a stateful Optimiser over (S × P, S × P) reparameterises the parameter ports of the Model ; Loss ; α composite, which also takes inputs (A, A′) and ground truth (B, B′).]
p̄ = U(s, p),   b_p = f(p̄, a),
(b′_t, b′_p) = R[loss](b_t, b_p, α(loss(b_t, b_p))),   (p′, a′) = R[f](p̄, a, b′_p).
While this formulation might seem daunting, we note that it just explicitly
specifies the computation performed by a supervised learning system. The variable
p̄ represents the parameter supplied to the network by the stateful gradient
update rule (in many cases this is equal to p); b_p represents the prediction of
the network (contrast this with b_t, which represents the ground truth from the
dataset). Variables with a tick ′ represent changes: b′_p and b′_t are the changes
on predictions and true values respectively, while p′ and a′ are changes on the
parameters and inputs. Furthermore, this arises automatically out of the rule for
lens composition (3); all we needed to specify are the lenses themselves.
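The stateful version differs from the earlier make_put sketch only in threading the optimiser's get map U and put map Q through the computation; again a sketch with assumed names:

# Supervised step with a stateful optimiser: U is the optimiser's get
# map, Q its put map, returning the new (state, parameter) pair.
def make_step(f, Rf, loss, Rloss, alpha, U, Q):
    def step(a, s, p, bt):
        p_bar = U(s, p)                                 # e.g. a lookahead
        bp = f(p_bar, a)
        dbt, dbp = Rloss(bt, bp, alpha(loss(bt, bp)))
        dp, da = Rf(p_bar, a, dbp)
        return Q(s, p, dp)
    return step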
We justify and illustrate our approach on a series of case studies drawn from
the literature. This presentation has the advantage of treating all these instances
uniformly in terms of basic constructs, highlighting their similarities and differences.
First, we fix some parametric map (R^p, f) : Para(Smooth)(R^a, R^b) in
Smooth and the constant negative learning rate α (Example 10). We then
vary the loss function and the gradient update, seeing how the put map above
reduces to many of the known cases in the literature.
Example 18 (Quadratic error, basic gradient descent). Fix the quadratic error
(Example 6) as the loss map and basic gradient update (Example 12). Then the
aforementioned put map simplifies. Since there is no state, its type reduces to
A × P × B → P, and we have put(a, p, b_t) = p + p′, where (p′, a′) = R[f](p, a, α ·
(f(p, a) − b_t)). Note that α here is simply a constant, and due to the linearity
of the reverse derivative (Def. 4), we can slide the α from the costate into the
basic gradient update lens. Rewriting this update and performing this sliding, we
obtain a closed-form update step put(a, p, b_t) = p + α · (R[f](p, a, f(p, a) − b_t); π_0),
where π_0 projects out the parameter component.
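The closed-form update of Example 18 can be sketched as follows, with −ϵ playing the role of the constant α (f and Rf assumed given):

# Example 18 in code: quadratic error with basic gradient descent.
def make_put_quadratic(f, Rf, eps):
    def put(a, p, bt):
        dp, _da = Rf(p, a, f(p, a) - bt)   # (p', a') = R[f](p, a, f(p,a) - bt)
        return p + (-eps) * dp             # p + alpha * pi_0(...)
    return put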
Example 19 (Softmax cross entropy, basic gradient descent). Fix Softmax cross
entropy (Example 8) as the loss map and basic gradient update (Example 12).
Again the put map simplifies. The type reduces to A × P × B → P and we have
put(a, p, b_t) = p + p′, where (p′, a′) = R[f](p, a, α · (Softmax(f(p, a)) − b_t)). The
same rewriting performed in the previous example can be done here.
Example 20 (Mean squared error, Nesterov momentum). Fix the quadratic error
(Example 6) as the loss map and Nesterov momentum (Example 15) as the
gradient update. This time the put map A × S × P × B → S × P does not have a
simplified type. The implementation of put reduces to put(a, s, p, b_t) = (s′, p + s′),
where p̄ = p + γs, (p′, a′) = R[f](p̄, a, α · (f(p̄, a) − b_t)), and s′ = −γs + p′.
This example with Nesterov momentum differs in two key points from all
the other ones: i) the optimiser is stateful, and ii) its get map is not trivial.
While many other optimisers are stateful, the non-triviality of the get map here
showcases the importance of lenses. They allow us to make precise the notion of
computing a “lookahead” value for Nesterov momentum, something that is in
practice usually handled in ad-hoc ways. Here, the algebra of lens composition
handles this case naturally by using the get map, a seemingly trivial, unused
piece of data for previous optimisers.
Our last example, using a different base category POLY_Z2, shows that our
framework captures learning in not just continuous but discrete settings too.
Again, we fix a parametric map (Z_2^p, f) : POLY_Z2(Z_2^a, Z_2^b), but this time we fix
the identity learning rate (Example 11) instead of a constant one.
Example 21 (Basic learning in Boolean circuits). Fix XOR as the loss map
(Example 7) and the basic gradient update (Example 13). The put map again
simplifies. The type reduces to A × P × B → P and the implementation to
put(a, p, b_t) = p + p′, where (p′, a′) = R[f](p, a, f(p, a) + b_t).
Consider now iterating the put map put : A × P × B → P (here modelled
without state S). This map takes an input-output pair (a_0, b_0) and the current
parameter p_i, and produces an updated parameter p_{i+1}. At the next time step, it
takes a potentially different input-output pair (a_1, b_1) and the updated parameter
p_{i+1}, and produces p_{i+2}. This process is then repeated. We can model
this iteration as a composition of the put map with itself, as a composite (A ×
put × B); put, whose type is A × A × P × B × B → P. This map takes two input-output
pairs in A × B and a parameter, and produces a new parameter by processing
these datapoints in sequence. One can see how this process can be iterated any
number of times, and even represented as a string diagram.
But we note that with a slight reformulation of the put map, it is possible
to obtain a conceptually much simpler definition. The key insight lies in seeing
that the map put : A × P × B → P is essentially an endo-map P → P with some
extra inputs A × B; it’s a parametric map!
In other words, we can recast the put map as a parametric map (A × B, put) :
Para(C)(P, P). Being an endo-map, it can be composed with itself. The resulting
composite is an endo-map taking two “parameters”: the input-output pairs at
time steps 0 and 1. This process can then be repeated, with Para
composition automatically taking care of the algebra of iteration.
[Diagram: the put map iterated n times as an endo-map on P, each copy receiving its own input-output pair as a “parameter”.]
This reformulation captures the essence of parameter iteration: one can think
of it as a trajectory p_i, p_{i+1}, p_{i+2}, ... through the parameter space; but it is a
trajectory parameterised by the dataset. With different datasets the algorithm
will take a different path through this space and learn different things.
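Operationally, this iterated composite is simply a fold of put over the dataset; a minimal sketch:

# Iteration as a fold of put over the dataset: the trajectory through
# parameter space is parameterised by the data.
def train(put, p0, dataset):
    p = p0
    for a, bt in dataset:
        p = put(a, p, bt)
    return p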
4.2 Deep Dreaming

We have seen that reparameterising the parameter port with gradient descent
allows us to capture supervised parameter learning. In this section we describe
how reparameterising the input port provides us with a way to enhance an input
image to elicit a particular interpretation. This is the idea behind the technique
called Deep Dreaming, appearing in the literature in many forms [19, 34, 35, 44].
[Diagram: deep dreaming: a stateful Optimiser over (S × A, S × A) reparameterises the input ports (A, A′) of the Model ; Loss ; α composite, while the parameter P and ground truth B are supplied as inputs.]
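A sketch of one deep-dreaming step, mirroring the supervised put map above but returning an updated input rather than an updated parameter (names ours):

# One deep-dreaming step: the change is applied to the input a,
# while the parameter p is held fixed.
def make_dream(f, Rf, loss, Rloss, alpha):
    def dream(a, p, bt):
        bp = f(p, a)
        dbt, dbp = Rloss(bt, bp, alpha(loss(bt, bp)))
        _dp, da = Rf(p, a, dbp)
        return a + da
    return dream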
5 Implementation
We model a lens (f, f ∗ ) in our library with the Lens class, which consists of a
pair of maps fwd and rev corresponding to f and f ∗ , respectively. For example,
we write the identity lens (1A , π2 ) as follows:
identity = Lens(lambda x: x, lambda x_dy: x_dy[1])
Let us now see how to construct a single-layer neural network from the composition
of such primitives. Diagrammatically, we wish to construct the following
model, representing a single ‘dense’ layer of a neural network:
[Diagram (9): a single dense layer as the composite of linear, bias, and activation, with parameter ports R^{b×a}, R^b, and trivial respectively, input R^a and output R^b.]
Here, the parameters of linear are the coefficients of a b × a matrix, and the
underlying lens has as its forward map the function (M, x) ↦ M · x, where M is
the b × a matrix whose coefficients are the R^{b×a} parameters, and x ∈ R^a is the
input vector. The bias map is even simpler: the forward map of the underlying
lens is simply pointwise addition of inputs and parameters: (b, x) ↦ b + x. Finally,
the activation map simply applies a nonlinear function (e.g., sigmoid) to the
input, and thus has the trivial (unit) parameter space. The representation of
this composition in code is straightforward: we can simply compose the three
primitive Para maps as in (9):
def dense(a, b, activation):
    return linear(a, b) >> bias(b) >> activation
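For completeness, here is one possible shape for the linear and bias primitives used above. This is our reconstruction under the Lens convention of Section 2.2's sketch, not the library's exact code; it glosses over the Para bookkeeping by bundling the parameter with the input, and the dimensions a, b are implicit in the array shapes.

# Sketch of the primitives: each forward map takes a (parameter, input)
# pair, and each backward map returns (parameter change, input change).
def linear(a, b):
    fwd = lambda Mx: Mx[0] @ Mx[1]                          # (M, x) -> M x
    rev = lambda Mx_dy: (np.outer(Mx_dy[1], Mx_dy[0][1]),   # dM = dy x^T
                         Mx_dy[0][0].T @ Mx_dy[1])          # dx = M^T dy
    return Lens(fwd, rev)

def bias(b):
    fwd = lambda bx: bx[0] + bx[1]                          # (b, x) -> b + x
    rev = lambda bx_dy: (bx_dy[1], bx_dy[1])                # db = dx = dy
    return Lens(fwd, rev)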
5.2 Learning
Now that we have constructed a model, we also need to use it to learn from
data. Concretely, we will construct a full parametric lens as in Figure 2 then
extract its put map to iterate over the dataset.
By way of example, let us see how to construct the following parametric lens,
representing basic gradient descent over a single layer neural network with a
fixed learning rate:
[Diagram (10): the parametric lens for basic gradient descent over a single dense layer: the dense model from (A, A′) to (B, B′) composed with a loss lens and the constant learning rate ϵ, with gradient descent reparameterising the parameter ports (P, P′).]
Now, given the parametric lens of (10), one can construct a morphism step :
B × P × A → P which is simply the put map of the lens. Training the model then
consists of iterating the step function over dataset examples (x, y) ∈ A × B to
optimise some initial choice of parameters θ_0 ∈ P, by letting θ_{i+1} = step(y_i, θ_i, x_i).
Note that our library also provides a utility function to construct step from
its various pieces:
step = supervised_step(model, update, loss, learning_rate)
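A hypothetical training loop using the extracted step map would then look as follows, with theta0 and dataset assumed given:

# Iterate step over the data, following theta_{i+1} = step(y_i, theta_i, x_i):
theta = theta0
for x, y in dataset:
    theta = step(y, theta, x)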
6 Related Work
The work [23] is closely related to ours, in that it provides an abstract categorical
model of backpropagation. However, it differs in a number of key aspects. We
give a complete lens-theoretic explanation of what is back-propagated via (i)
the use of CRDCs to model gradients and (ii) the Para construction to model
parametric functions and parameter update. We can thus go well beyond [23]
in terms of examples: their example of smooth functions and basic gradient
descent is covered in our Section 4.1.
We also explain some of the constructions of [23] in a more structured way.
For example, rather than considering the category Learn of [23] as primitive,
here we construct it as a composite of two more basic constructions (the Para
and Lens constructions). This flexibility could be used, for example, to
compositionally replace Para with a variant allowing parameters to come from a
different category, or to replace lenses with the category of optics [38], enabling us
to model things such as control flow using prisms.
One more relevant aspect is functoriality. We use a functor to augment a
parametric map with its backward pass, just like [23]. However, they additionally
augmented this map with a loss map and gradient descent using a functor as
well. This added an extra condition on the loss function: its partial derivative
in the second variable needed to be invertible. This constraint was not justified
in [23], nor is it a constraint that appears in machine learning practice. This led
us to reexamine their constructions, coming up with our reformulation that does
not require it. While loss maps and optimisers are mentioned in [23] as parts of
the aforementioned functor, here they are extracted out and play a key role: loss
maps are parametric lenses and optimisers are reparameterisations. Thus, in this
paper we instead use Para-composition to add the loss map to the model, and
Para 2-cells to add optimisers. The mentioned inverse of the partial derivative
of the loss map in the second variable was also hypothesised to be relevant to deep
dreaming. We have investigated this possibility thoroughly in our paper, showing
that it, too, is not required.
References
THE SANCTIFIED.
‘Unto the church of God which is at Corinth, to them that are
sanctified in Christ Jesus, called to be saints, with all that in
every place call upon the name of Jesus Christ our Lord, both
their’s and our’s.’—1 Cor. i. 2.
I hope it has been made clear that the original meaning of the word
‘to sanctify,’ was ‘to set apart as a holy thing unto God,’ and that the
Levitical meaning of sanctification through blood was the cleansing
from all legal impurity. It is obvious that both these divine acts
clearly involve personal holiness.
That which is set apart by the call of God, and cleansed from all
guilt, should clearly be kept holy and undefiled from the pollution
both of the heart and of the world. Such persons should be like the
vessels of the sanctuary: ‘sanctified and meet for the Master’s use.’
This leads us to that which is the ordinary meaning of the term in
religious books: viz. personal holiness. By personal holiness is
meant the sacred work of God the Holy Ghost within the soul; the
reflection of the character of our Blessed Lord; the law put into the
inward part, and written on the heart by the Spirit of God. This is
the meaning of such texts as 1 Thess. v. 23: ‘The very God of peace
sanctify you wholly; and I pray God your whole spirit and soul and
body be preserved blameless unto the coming of our Lord Jesus
Christ.’ The separation is a past act, for if we are in Christ Jesus, we
have been already separated unto God; but this is an abiding
condition, for real holiness is a present matter of daily life. Now
both these parts of sanctification are brought out in the words which
I have read as our text, for it is addressed ‘to them that are,’ or have
been, ‘sanctified in Christ Jesus, called to be saints.’ The first clause
refers to the past act, and represents those believers as having been
set apart unto God, or separated as a peculiar people unto Himself;
the second describes their present condition as inseparable from
their high calling; for, having been set apart, they are now called to
be the saints of God. I need not stop to point out that the word
‘saints,’ is not limited in Sacred Scripture to those who are in
heaven. Still less has it to do with the canonization of Rome, or the
seclusion of a monastery. It describes the personal holiness of the
Christian in common life, the habitual character of the man walking
with God; as in those words of the Apostle Peter (1 Pet. i. 15): ‘Be
ye holy;’ or be ye saints, ‘in all manner of conversation.’ So that our
text means the same as if it had been written, ‘To them that have
been set apart unto God in Christ Jesus their Saviour, and who, as
the result of that sacred call, are now leading holy lives in His
presence.’
My object this morning will be simply to trace the connexion
between these two steps of God’s sacred work: the past separation
unto Him, and the present personal holiness of character. And all I
can say is, may the Lord help us in our own experience to
understand both of them, and then we shall have no difficulty in
perceiving how they are bound together in the work of Salvation!
We shall find them connected by a principle and a power.
I. A Principle. It is perfectly clear, as a matter of principle, that
there ought to be holiness in all that is consecrated to God. He
consecrated, or set apart unto Himself, the Sabbath day, and so He
says, ‘Remember the Sabbath day, to keep it holy.’ The Temple, like
our own churches, was consecrated to God, and therefore we read,
‘Holiness becometh Thine house for ever.’ The vessels of the Temple
were dedicated, or sanctified to His service, and therefore they
should not be touched by unhallowed hands, and the words of
sacred Scripture are, ‘Be ye clean, that bear the vessels of the Lord.’
On these principles we none like to see a neglected church, a
dishonoured Bible, or a careless attitude, in the house of God. On
the same principles, we should all be profoundly humbled when
unhallowed thoughts,—thoughts of the world, thoughts of vanity, of
jealousy, or of self,—in any shape intrude into holy things, and
corrupt those sacred hours which are set apart exclusively to God.
But if this applies to consecration generally, how pre-eminently does
it apply to such a consecration as that described in the text, in which
we are said to be ‘sanctified in Christ Jesus.’ That sanctification is
the introduction of the ruined sinner into a covenant union with the
Son of God. If you have been thus sanctified in Christ Jesus, you no
longer stand alone, to bear your own burden, or plead your own
cause. You have been separated from the ruined world, and
identified with the Lord Jesus; so that He represented you in bearing
your curse when He suffered, and He now represents you at the
right hand of the throne of God, while He pleads on your behalf.
Thus you are cleansed from all legal guilt. You are sanctified by
blood, and charged with no defilement. By the eternal covenant of
God He is become your Head, so that in His death you died; in His
life you live; in His acceptance you are justified; and in His glory you
are glorified.
But, if you are thus separated unto such a union with Him, is there
any room left for one moment’s doubt as to what ought to be your
character? If you are set apart by Him into this covenant union with
Himself, you are set apart into a oneness of mind, of will, and of
interest. He represents you in heaven, and you represent Him on
earth; as God sees you in Him, so the world sees Him in you. You
bear His name; you are sealed with the seal of the covenant; you
are made a peculiar people unto the Son of God: and I am sure we
must all see the justice of those words of the Apostle Peter, ‘As He
who hath called you is holy, so be ye holy in all manner of
conversation.’
II. Power. But here lies the difficulty. You really wish to be holy, but
you are not; you have endeavoured to overcome your temper, but it
is still there; you have striven against wandering thoughts in prayer,
but they still interfere most mournfully with your most sacred acts of
worship; you wish, and you mean to be, unselfish, but you find
selfishness continually cropping up, to your sorrow and vexation of
spirit. Now, the question is,—How is this to be overcome? Your
resolves will not do it, for you have made hundreds, and failed in
them all; and no man on earth can do it for you, for the evil lies far
too deep for the reach of man. What then is to be done? We may
turn back to our text, and there learn the secret, for in those words,
‘In Christ Jesus,’ we are taught the power.
We learned, in studying the first act of separation, that it was the
work of God the Holy Ghost: here it is said to be in Christ Jesus.
Some people dwell more on the distinction than I am myself
disposed to do: there is such a perfect oneness in the infinite God,
that I confess I have but little heart for these refined distinctions. As
the Father and the Son are one, so God the Son and God the Holy
Ghost are one; and when the Lord Jesus saves the sinner, it is the
Holy Ghost that applies that salvation to the soul. Without stopping,
therefore, to study any such distinction, let us rather hasten to the
practical lesson that sanctification, or consecration, here described,
is a sanctification in Jesus Christ. You may look thus to your
covenant union with Him, and trust Him by the in-breathing of His
own Spirit to make you holy. You may remember that He came to
save you from your sins, and not merely from their curse; and that
holiness is just as much a gift of the covenant as pardon.
You remember those words, 1 Cor. i. 30: ‘But of Him are ye in Christ
Jesus, who of God is made unto us wisdom, and righteousness, and
sanctification, and redemption.’ They teach us that the Lord Jesus is
the source of all practical wisdom and holiness, just as much as of
imputed righteousness and redemption. The passage is clearly not
speaking of an imputed wisdom, and we have no right to apply it to
an imputed sanctification. It refers to the practical wisdom and
personal holiness of the man who by God’s grace is wise and holy;
and teaches that both one and the other are found exclusively in
Christ Jesus.
You may, therefore, trust the Lord Jesus Christ for your sanctification
just as much as for your justification; for your personal holiness in
daily life, as well as for your safety in the great day of judgment.
Look carefully then unto your covenant union with Christ, and think
on Him as your covenant Head: then spread out all your difficulties
and temptations before Him as your Head. Acknowledge before Him
how you have dishonoured His headship by your evil thoughts, your
evil words, and your constant failures; and trust Him, as your Head,
to form in you His image, and by His own most Holy Spirit to give
you the victory. Do not stand at a distance, thinking it your duty to
doubt your union, for by so doing you will never overcome. Without
that union you will never know what victory means. I am assuming
now that in the secret of your own souls you have been verily
engrafted into Christ. I know you were sacramentally in baptism,
but I am looking deeper than that, for many who are baptized are
never saved, and I am speaking now of the real saving union of your
soul with Christ Himself. Now, if that is yours through His wonderful
grace, accept it, act upon it, trust Him as your living Head; and you
will find, as time goes on, that though you cannot overcome, He
can; and that He will finally present you, ‘holy and unblameable
before Him at His coming.’ But here we are brought to the old
difficulty,—that you have no real evidence in your soul of the
existence of such a union with Christ. You know there is your
baptismal union, but still you cannot feel safe, and you greatly doubt
whether you are amongst those who have been ‘Sanctified in Christ
Jesus: called to be saints.’ This is the reason why many of you
cannot come to the sacred feast of the Supper of the Lord, and why
many others, who do come, come with a heavy heart. Would to God
we might see those absent ones brought near, and those heavy
hearts gladdened by the Lord! But in order to that you must grapple
at once with the great question of your own personal salvation,—
your separation unto God. Till that is settled you will be powerless
against yourself. Till you are in Christ, and sanctified in Christ Jesus,
you will never be sanctified at all. If you really desire to be really
holy, for the sake of that holiness begin at the beginning, and never
rest till you are safe. Your safety must come before your holiness, or
you will wait for it for ever. Begin therefore with the prayer, ‘Lord,
save me: I perish.’ Throw yourself into His hand for pardon, for
acceptance, for life. Never rest till you can appropriate the language
of St. Paul: ‘Who hath saved me, and called me with an holy calling.’
And, when that is given, you may go on to those other words of the
same Apostle, and say, ‘According as He hath chosen us in Him
before the foundation of the world, that we should be holy and
without blame before Him in love.’
PROGRESS.
‘But we all, with open face beholding as in a glass the glory of
the Lord, are changed into the same image from glory to glory,
even as by the Spirit of the Lord.’—2 Cor. iii. 18.
I. The Standard. How many a noble ship has been lost through
some inaccuracy in the compass! If the compass points too much to
the east or to the west, the most careful commander may wreck his
vessel. And if the compass of the soul is in a wrong direction you
will find it very hard to walk in a right path. Now, in this text the
one standard is the image of the Lord Jesus. We are said to behold
the glory of the Lord, and to be changed into His image. There
cannot be a doubt that this refers to the Lord Jesus Christ, and that
by His glory is meant His grace. If there were, it would be settled by
these words in John, i. 14: ‘The Word was made flesh, and dwelt
among us; and we beheld His glory, the glory as of the only-
begotten of the Father, full of grace and truth.’ The great
manifestation of the glory of God is in the grace and truth of the
incarnate Word. If, then, we would be holy, as God is holy, we must
be changed into the image of the Lord Jesus Christ. When we are
like Him we shall be holy, harmless, undefiled, and separate from
sinners; but not before. So you will find that, when persons speak
about their sinlessness, it may generally be traced to their adopting
a low standard of holiness. Sometimes people will set up their own
experience as a standard, and really seem to think that we are to
receive their accounts of their own experience as if it were another
Bible. Sometimes we read of a perfection, not absolute, but ‘up to
the measure of to-day’s consciousness.’ Accordingly I have read of
one described as an eminent Christian, who ‘said that a few days
more would make twenty-one years that his obedience had been
kept at the extreme verge of his light.’ I am not sure that I know
what the writer means, and I may possibly misunderstand his words;
but, if they mean what they seem to mean, I can scarcely imagine
anything more delusive. We know very well how the eyes may be
blinded, the heart deadened, and the conscience seared by sin; we
know that the deeper a man is sunk in sin the less he feels it, and
the lower his fall the more profound his want of feeling; and only
imagine what must be the result if a deadened, thickened, darkened
conscience were to be accepted as the measure of a sinless life.
The idea reminds one of these words of St. Paul, 2 Cor. x. 12: ‘They
measuring themselves by themselves, and comparing themselves
among themselves, are not wise.’ Nay: they go further, and point to
the tremendous danger pointed out by our Lord Himself: ‘If the light
that is in thee be darkness, how great is that darkness.’ (Matt. vi.
23.) No: we must have a standard rising high above either
consciousness, conscience, or our own light; a standard that never
varies; a standard that does not go up and down with our changes
of feeling or opinion; a standard as unchanging as the perfect
character of God Himself! This is the standard which we find in the
Lord Jesus Christ,—in God manifest in the flesh: and what is more,
thanks be to God, this is the standard which we shall one day
reach! For, though there are many things still hidden, there is one
thing we most assuredly know, and that is, that ‘when He shall
appear we shall be like Him, for we shall see Him as He is.’
II. The Progress. As I have just said, when we see the Lord as He is
we shall be like Him; and when that comes to pass we shall see the
perfection of the promise, He ‘shall save His people from their sins.’
He will so completely save them that whereas He finds them corrupt,
ruined, and enemies to God by wicked works, He will finally present
them holy and unblameable, without spot and without blemish
before the throne. It is impossible to imagine anything more
blessed, more wonderful, more divine, than such a change. Now it
becomes a question of the deepest interest whether this mighty
change is accomplished by one instantaneous act, or gradually. Is it,
like justification, a completed thing? or is it a progressive work,
commenced at the new birth, but not complete till we see Him as He
is? There cannot be a more important practical inquiry. And now
you may see the importance of the distinction drawn between the
different senses of the word ‘sanctification;’ or, as it might be better
expressed, the different parts of that blessed work. If you speak of
sanctification as the original act of God in separating us unto
Himself, then it is a completed thing, for we are described as ‘having
been sanctified in Christ Jesus.’ If, again, you speak of it as a legal
cleansing from all past guilt, it is complete, for being washed in the
precious blood we are already clean. But if you regard it as the
personal holiness of daily life, the purifying the heart through faith
by the indwelling power of the Holy Ghost, then I am prepared to
maintain from the whole testimony of the whole Word of God from
one end to the other, that so long as we are in this world of conflict
the sacred work is not complete, but progressive. How people can
speak of sanctification in this sense as an immediate work, I am at a
loss to understand. Hundreds of passages might be quoted to prove
its progressive character, and to show the reason of its present
incompleteness: viz., the abiding power of indwelling sin. I have
only time to refer to two. In the first place, this verse describes us
as changed, or being changed, from glory to glory. We are
described as in the process of transformation, or metamorphosis; by
His grace passing from glory to glory, or from one degree of grace to
another. The work is in progress, thanks be to God! and we have
the bright hope of the completed likeness of the Lord. But that
bright hope is not yet realized, nor will it be till we see Him as He is.
I will take only one other passage, and select it because it
corresponds very closely to the text. It is a passage addressed to
the believers in Rome,—to persons who are described as being
‘beloved of God called to be saints.’ (Rom. i. 7.) There can be no
doubt then that the work of personal holiness was begun in them:
yet what does St. Paul say to them? (Rom. xii. 2.) ‘Be not
conformed to this world: but be ye transformed (or
metamorphosized) by the renewing of your mind, that ye may prove
what is that good, and acceptable, and perfect will of God.’ Is it not
clear then that those persons who were beloved of God, and called
to be saints, were still to be reaching forth after higher attainments?
There was so much evil in them that they still required to be warned
against conformity to the world, and so far were they from their high
standard, that they required nothing less than a transformation or
metamorphosis (it is the same word as in the text), in order to bring
them into a personal experience ‘of the good, and acceptable, and
perfect will of God.’ Be sure then there is no resting-place in
Christian holiness for the saints of God. The Lord may have done
great things for us, whereof we are glad. He may have given us
such an insight into His grace that we now love that which we once
cared nothing for, and hate that which we once loved: He may have
led us to say from the bottom of our hearts, ‘I delight to do Thy will,
O my God.’ But our motto must still be, ‘Forgetting those things that
are behind, and reaching forth unto those things that are before, I
press towards the mark for the prize of the high calling of God in
Christ Jesus.’ The more we love Christ, the more must we be deeply
humbled that we love Him so little; and the more we look at the
blessed prospect of a real and perfect sinlessness, the more must we
be ready to say, as St. Paul did, ‘Not as though I had already
attained, either were already perfect: but I follow after, if that I may
apprehend that for which also I am apprehended of Christ Jesus.’
III. But some of you will be ready to say that that is just where your
difficulty lies. You do really desire to be going forward, and to be
making progress, but it seems as if you could not. You are like a
person in a nightmare, who wishes to run, but cannot. Let us then
consider what is God’s great instrument, whereby He imparts
progress to the soul. On this subject this text is quite decisive, for it
shows that God’s great instrument is the view of the Lord Jesus
Christ through faith. In the passage to which I have already referred
in 1 John, iii. 2, we find that the perfect view of the Lord Jesus will
lead to perfect likeness, so in these words the partial view leads to
progressive likeness. When the view is perfect the likeness will be
perfect too; now that the view is imperfect, only as through a glass,
the likeness is imperfect likewise. But still it is growing more and
more; for ‘we all,’—not merely special Christians, who have attained
what they call ‘the higher life,’—‘beholding as in a glass the glory of
the Lord, are changed into the same image from glory to glory.’
Now I believe it is impossible to press this too strongly on all those
who desire holiness, for there is a perpetual tendency in every one
of us to turn the eye inward on ourselves, instead of keeping it fixed
on Him. Some are occupied with what they feel, or do not feel, or
wish to feel, or wish they did not feel; and some by what they do, or
mean to do, or think they ought to do,—till the whole mind becomes
bewildered, and the whole soul entangled. Remember that you may
be entangled by your religious efforts, as well as by your sins:
nothing indeed entangles people more than confused and mistaken
religion. So that if you really want to be like Him, you must sweep
away all your entanglements like so many cobwebs, and, just as you
are, look straight at Him. For example, you say you do not feel sin,
and you do not feel anything like the sorrow for it that you know you
ought to feel. I have no doubt you are perfectly right, and it is very
sorrowful, very sinful, and very sad. But how is it to be overcome?
I know of only one way, and that is a very simple one, too simple for
many of you,—and that is a look: you must behold Christ.
Remember the case of the Jews. Nothing yet has melted their
hearts: their great national afflictions have utterly failed: but in God’s
time there will be a change. We shall see those people mourning,—
so mourning that they will be led with broken hearts to the Fountain
opened for sin and for uncleanness. And what will be God’s instrument
for producing such a change? By what means will He effect it? By a
look: a simple look! You find it described in Zech. xii. 10: ‘They shall
look on Me whom they have pierced, and shall mourn for him.’ That
one look will accomplish more than 1800 years of bitter, and most
afflictive, discipline. And it is just the same with ourselves. One
look at our loving and living Saviour will do more towards softening
the heart than hours spent in the scrutiny of feeling, or whole books
of self-examination. If you want to grow in grace, in tenderness of
conscience, in holy abhorrence of sin, in purity of heart, in lowliness
of spirit, and in thankful love for your blessed Saviour,—then look on
Him, keep your eyes on Him. Think on His Cross, how He died for
you; on His life, how He lived for you; on His advocacy, how He
pleads for you; on His perfect character, His love, His holiness, His
purity, His power, His grace, His truth: for by such a look, and such a
look alone, can you ever hope to be changed into His image.
But remember one expression in the text: viz. those three words,
‘With open face.’ The look that transforms is a look with an open
face: there must be nothing between. There must not be a veil over
it, as there is over the Jews, as you read in verse 15. Every barrier
must be removed. The great barrier of the curse is gone, through
the blood of atonement; and we must not now set up fresh barriers
of our own creation. We must remember the hymn:—

INFECTION OF NATURE.

‘I thank God through Jesus Christ our Lord. So then with the
mind I myself serve the law of God; but with the flesh the law
of sin.’—Rom. vii. 25. [64]
There are few passages in the whole word of God that have excited a
deeper interest amongst truly Christian people than the latter part of
the seventh of Romans. It is so closely connected with the practical
experience of Christian life, and at the same time it is so much
opposed to the beautiful theories of some Christian people, that it
has always excited an earnest spirit of inquiry, and engaged the
deepest interests of the students of Scripture. I propose to make it
the subject of our study this morning: to endeavour to find out what
the Scripture really teaches. And in the outset of our study I should
wish to give one caution, which I believe to be of the utmost
importance for us all: viz., we must not bring the sacred Scriptures
to the test of our theories, but must be prepared, if need be, to give
up those theories to the authority of Scripture. If we want to live in
God’s truth, we must be subject to God’s Word, and must be content
to receive what He teaches as He teaches it. In other words, we
must not twist Scripture so as to make it fit our own opinions, but
must receive it as from God, and make all our opinions bend before
its high authority.
With this caution before us, there are three subjects to be
considered. First, of whom is the Apostle speaking: of himself, or
some other man? Secondly, of what period in his Christian life is he
speaking: does he refer to the past or to the present? And thirdly,
what does the passage teach respecting his spiritual condition at the
time he wrote the words?
And now, may God the Holy Ghost, who inspired the Word, lead us
all reverently to study, and rightly to understand, His teaching!
I. To whom does it refer? I feel persuaded that we shall all admit
that, if any person were to read the chapter without having some
previous opinion to support, he would believe that the Apostle was
speaking of himself. The word ‘I’ occurs no less than twenty-eight
times in the passage. Such expressions as ‘I do,’ ‘I consent,’ ‘I allow,’
‘I delight,’ are found continually; and certainly the natural conclusion
would be that when he said, ‘I,’ he meant himself. I know that it is
sometimes said that he personated some other person; a legalist, or
one in a lower Christian life. But there is not the least evidence of
any such personation in the passage, and he says not one word to
lead us to suppose that such was his intention. In iii. 5, he does
thus personate an objector; and says, ‘But if our unrighteousness
commend,’ &c. But then he distinctly states that he is doing so in
the words, ‘I speak as a man.’ But there is nothing of the kind here.
There is a plain, simple statement in his own name; passing from
the ‘I,’ which pervades the chapter, to ‘I myself,’ in the last verse;
and I am utterly at a loss to understand on what principle these
plain words, ‘I myself,’ can be supposed to express the personation
of some other man.
II. In answer then to our first question, I am brought to the
conclusion that when he spoke of ‘I myself,’ he meant himself, and
not another: and we may pass to our second question. To what
period of his spiritual history did he refer: did he speak of the past,
or of the present? Was he describing some period of past anxiety
out of which he had been delivered, so as to enter on the joys of the
eighth chapter? Or was he speaking of his state of mind at the very
time that he was writing the eighth chapter, and declaring, ‘Ye have
not received the spirit of bondage again to fear; but ye have
received the spirit of adoption, whereby we cry, Abba, Father?’ In
answer to this I have not the smallest hesitation in saying that,
according to every principle of sound exposition, the seventh chapter
refers to exactly the same period as the eighth; that it is a
description of his own experience at the time he wrote the words;
and that we should have just as much authority for saying that the
eighth chapter referred only to the future, as that the seventh
referred only to the past. For this I give three reasons:—
(1.) If we wish to understand the Word of God we must receive
plain words as we find them in sacred Scripture. We have no right
to assume that the present tense stands for the past; that ‘I am,’
means ‘I was;’ that ‘I do,’ means ‘I used to do;’ that ‘I hate,’ means
‘I used to hate;’ and ‘I delight,’ means ‘I used to delight.’ If we once
begin thus to handle Scripture there is an end to exposition; and if
people who thus twist Scripture would be consistent, they ought to
go on, and say that the beginning of this verse, ‘I thank God through
Jesus Christ our Lord,’ means, ‘I used to thank Him, but I do not
now.’
(2.) Again, the transition from the past to the present is clearly
marked in the passage. In the parenthesis which extends from the
seventh verse to the end of the chapter, we find the three tenses,—
past, present, and future. From verse seven to verse thirteen it is all
in the past, and is a description of a certain portion of his past life.
‘I was alive;’ ‘the commandment came;’ ‘sin revived;’ ‘I died;’ ‘I
found it to be unto death;’ and ‘sin deceived me, and by it slew me.’