The model of declarative eventing allows for listening to very specific events and then triggering specific actions. This model simplifies the developer experience and optimizes the system by reducing network traffic.

AWS S3 bucket trigger

In looking at how AWS explains the way changes in S3 can trigger Lambda functions, I found that the AWS product docs focus on the GUI configuration experience. This probably makes it easy for new folks to write a specific Lambda function; however, it makes it a little harder to see the system patterns before gaining a lot of hands-on experience.

The trigger-action association can be seen more clearly in a Terraform configuration (under the hood, Terraform must be using AWS APIs to set up the trigger). The configuration below specifies that whenever a JSON file is uploaded to a specific bucket with the path prefix “content-packages”, a specific Lambda function will be executed:

resource "aws_s3_bucket_notification" "bucket_terraform_notification" {
    bucket = "${aws_s3_bucket.terraform_bucket.id}"
    lambda_function {
        lambda_function_arn = "${aws_lambda_function.terraform_func.arn}"
        events = ["s3:ObjectCreated:*"]
        filter_prefix = "content-packages/"
        filter_suffix = ".json"
    }
}

— via justinsoliz’ GitHub gist

Google Cloud events

To illustrate an alternate developer experience, the examples below are shown with the Firebase JavaScript SDK for Google Cloud Functions, which is idiomatic for JavaScript developers using the fluent API style popularized by jQuery. The same functionality is available via command-line options using gcloud, the Google Cloud CLI.

Cloud Storage trigger

Below is an example of specifying a trigger for a change to a Google Cloud Storage object in a specific bucket:

const functions = require('firebase-functions');

exports.generateThumbnail = functions.storage.bucket('my-bucket').object().onChange((event) => {
  // ...
});
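To make the shape of the event a bit more concrete, here is a slightly fuller sketch using the same API as above, where the handler receives the Cloud Storage object metadata as event.data; the file names and the thumbnail logic are placeholders, not part of the original example:

const functions = require('firebase-functions');

exports.generateThumbnail = functions.storage.bucket('my-bucket').object().onChange((event) => {
  const object = event.data;               // Cloud Storage object metadata
  const filePath = object.name;            // e.g. "images/photo.png" (placeholder)
  const contentType = object.contentType;  // e.g. "image/png"

  // onChange also fires for deletions in this version of the API, so handlers
  // typically check resourceState before doing any work.
  if (object.resourceState === 'not_exists') {
    console.log(`${filePath} was deleted, nothing to do`);
    return null;
  }

  console.log(`${filePath} (${contentType}) changed; thumbnail generation would go here`);
  return null;
});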

Cloud Firestore trigger

This approach to filtering events at their source is very powerful when applied to database operations, where a developer can listen to a specific database path, such as with Cloud Firestore events:

const functions = require('firebase-functions');

exports.createProduct = functions.firestore
  .document('products/{productId}')
  .onCreate(event => {
    // Get an object representing the document
    // e.g. {'name': 'Wooden Doll', 'description': '...'}
    var newValue = event.data.data();

    // access a particular field as you would any JS property
    var name = newValue.name;

    // perform desired operations ...
  });
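The same path-based filtering applies to other operations. As a sketch (using the same API as above, with an illustrative function name and field), an onUpdate handler receives both the new document data and the previous value:

const functions = require('firebase-functions');

exports.auditPriceChange = functions.firestore
  .document('products/{productId}')
  .onUpdate(event => {
    // In this version of the API, event.data holds the document after the
    // change and event.data.previous holds the document before the change.
    var newValue = event.data.data();
    var previousValue = event.data.previous.data();

    // 'price' is an illustrative field name, not part of the original example
    if (newValue.price !== previousValue.price) {
      console.log('price changed for product', event.params.productId);
    }
  });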

An emerging pattern in server-side event-driven programming formalizes the data that might be generated by an event source; a consumer of that event source then registers for very specific events.

A declarative eventing system establishes a contract between the producer (event source) and consumer (a specific action) and allows for binding a source and action without modifying either.

Comparing this to how traditional APIs are constructed, we can think of it as a kind of reverse query: we reverse the direction of the typical request-response by registering a query and then getting called back every time there is a new answer. This new model establishes a specific operational contract for registering these queries, which are commonly called event triggers.
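As a toy illustration of that contract (this is not any particular platform’s API; all of the names here are made up), a trigger registry lets a consumer declare the events it wants to be called back about, while the producer simply publishes events without knowing who is listening:

// A minimal in-memory sketch of declarative trigger registration.
const triggers = [];

// Consumer side: register a "reverse query" (a filter plus an action).
function registerTrigger(filter, action) {
  triggers.push({ filter, action });
}

// Producer side: publish an event without knowing who consumes it.
function publish(event) {
  triggers
    .filter(({ filter }) =>
      event.type === filter.type &&
      (!filter.prefix || event.key.startsWith(filter.prefix)))
    .forEach(({ action }) => action(event));
}

// Binding: neither the producer nor the action is modified to connect them.
registerTrigger(
  { type: 'object.created', prefix: 'content-packages/' },
  (event) => console.log('new package uploaded:', event.key)
);

publish({ type: 'object.created', key: 'content-packages/widgets.json' });

The point is the decoupling: the producer and the action never reference each other; the binding lives entirely in the registered trigger.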

This pattern requires a transport for event delivery. While systems typically support HTTP and RPC mechanisms for local events, which might be connected point-to-point in a mesh network, they also often connect to messaging or streaming data systems, such as Apache Kafka and RabbitMQ, as well as proprietary offerings.
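To make the HTTP case concrete, here is a minimal sketch of a function endpoint that receives events delivered as HTTP POSTs; the payload shape is generic, not any specific provider’s wire format:

// Minimal sketch of HTTP as an event-delivery transport: the platform POSTs
// an event payload to the function's endpoint.
const http = require('http');

http.createServer((req, res) => {
  let body = '';
  req.on('data', (chunk) => { body += chunk; });
  req.on('end', () => {
    const event = JSON.parse(body || '{}');
    console.log('received event', event.type, 'for', event.key);
    res.writeHead(200);
    res.end();
  });
}).listen(8080);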

This declarative eventing pattern can be seen in a number of serverless platforms, and is typically coupled with Functions-as-a-Service offerings, such as AWS Lambda and Google Cloud Functions.

An old pattern applied in a new way

Binding events to actions is nothing new. We have seen this pattern in various GUI programming environments for decades, and on the server side in many Service-Oriented Architecture (SOA) frameworks. What’s new is that we’re seeing server-side code that can be connected to managed services in a way that is almost as simple to set up as an onClick handler in HyperCard. However, the problems that we can solve with this pattern are today’s challenges of integrating data from disparate systems, often at high volume, along with custom analysis, business logic, machine learning and human interaction.

Distributed systems programming is no longer solely the domain of specialized systems engineers who create infrastructure; most applications we use every day integrate data sources from multiple systems across many providers. Distributed systems programming has become ubiquitous, providing an opportunity for interoperable systems at a much higher level.

Some people have the privilege to be recognized in our society. We’ve started to resurrect history and tell stories of people whose contributions have been studiously omitted. In America, February is Black history month. I didn’t learn Black history in school. I value this time for remedial studies, even as I feel a bit disturbed that we need to aggregate people by race to notice their impact. I had hoped that we, as a society, would have come farther along by now, in treating each other with fairness and respect. Instead, we are encoding our bias about what is noticed and who is recognized.

At the M.I.T. Media Lab, researcher Joy Buolamwini has studied facial recognition software, finding that error rates increased with darker skin (via NYT). Specifically, algorithms by Microsoft, IBM and Face++ more frequently failed to identify the gender of black women than that of white men.

When the person in the photo is a white man, the software is right 99 percent of the time.
But the darker the skin, the more errors arise — up to nearly 35 percent for images of darker skinned women

A lack of judgement in choosing a data set is cast as an error of omission, a small lapse in attention on the part of software developers, yet the persistence of these kinds of errors illustrates a systemic bias. The systems that we build (ones made of code and others made of people) lack checks and balances where we actively notice whether our peers and our software are exercising good judgement, which includes treating people fairly and with respect.

Errors made by humans are amplified by the software we create.

Google "knowledge card" for Bessie Blount Griffen, shows same photo for Marie Van Brittan Brown and Miriam Benjamin

The Google “knowledge card” that appears next to the search results for “Bessie Blount Griffen” shows, under “people also searched for”, two other women who are identified with the same photo. It’s hard to tell when the error first appeared, but we can guess that it was amplified by Google search results and perhaps by image search.

I first discovered this error when reading web articles about two different inventors: I noticed that the photos used in many of the articles were identical. This can be seen clearly in the two examples below, where the photo is composited with an image of the corresponding patent drawings. The patents were awarded to two unique humans, but somehow we, collectively, blur their individual identities, anonymizing them with a singular black female face.

I found a New York Times article about Marie Van Brittan Brown, and it seems that the oft-replicated photo is of Bessie Blount Griffen.

Additional confirmation comes from a tweet by @SamMaggs, author of “Wonder Women: 25 Innovators, Inventors, and Trailblazers Who Changed History”.

photo of black women and patent drawing of medical apparatus
source: Black Then

patent drawing with home security system, with text: Marie Van Brittan Brown invented First Home Security System in 1966
source: Circle City Alarm blog

Newspaper article with photo of black woman and man behind her, caption: Mr and Mrs Albert L Brown