In the previous piece we redesigned the store interface for the backend, in the hope of enabling a better DynamoDB implementation for it. After the changes we have to provide a `Load` method that returns our `Session` object along with its current version, and a `Save` method that receives the ID, the modified `Session` object and the (presumably) current version, which should match the one still in the database; if the provided version does not match the current version, the method should return an `ErrVersionMismatch` error.
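The contract described above could be sketched roughly like this; the type and method names here are my shorthand, not necessarily pajthy's exact definitions:

```go
package main

import (
	"errors"
	"fmt"
)

// ErrVersionMismatch signals that the caller's version is no longer the
// current one in the database.
var ErrVersionMismatch = errors.New("version mismatch")

// Session stands in for pajthy's session object; the real one lives in the
// store package.
type Session struct {
	Votes map[string]string
}

// Store captures the contract described above.
type Store interface {
	// Load returns the session together with its current version.
	Load(id string) (*Session, string, error)
	// Save succeeds only if version still matches the stored one;
	// otherwise it returns ErrVersionMismatch.
	Save(id string, session *Session, version string) error
}

func main() {
	fmt.Println(ErrVersionMismatch)
}
```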
Originally I was planning to publish the content of this post together with the previous one, but while writing I realized it was getting too long for a single post. On the flip side, now that the post has been split in two, I have the chance to show some things in more detail than I originally planned 🎉!
This item is the sixth one of a series called Pajthy to AWS where I try to capture the process of migrating one of my open-source pet projects into a serverless setup in AWS.
About DynamoDB
DynamoDB is a key-value database provided by AWS, with all the glitter we could expect from the go-to NoSQL solution of one of the largest cloud service providers: it’s fast, flexible, can work as a multi-region database, and these are just some of the qualities that Amazon uses to describe their service.
The value part of an entry can be practically anything we could write into a JSON document: besides the common primitives it supports arrays and maps as well. It’s important to note that it does not store JSON objects, though the model is pretty similar; think of it as JSON with type definitions.
We can define a schema for our data beforehand, but we can extend an entry with different attributes later without any schema changes if we want. The key is part of the data, so at least that is what we have to define when we create the table.
Condition expressions
In the Optimistic locking post we saw that for optimistic locking to work we need an atomic operation that checks a given condition and then either saves the data or returns an error based on the result of that condition. DynamoDB supports condition expressions, which we can utilize for exactly this.
The condition in our case: when there is no version supplied with the session object passed to `Save`, we want the `attribute_not_exists(SessionID)` expression. If we have a version, then what we check is `version = :last_known`. With these expressions in place the `PutItem` operation fails with a `ConditionalCheckFailedException` when the condition does not hold: that’s something we can look for in our responses.
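As a minimal illustration of the two cases, picking the raw expression could look like this (attribute names are assumptions based on the post):

```go
package main

import "fmt"

// conditionFor returns the raw DynamoDB condition expression for a Save call:
// without a version we require the item to be brand new, with one we require
// the stored version to match the last known one.
func conditionFor(version string) string {
	if version == "" {
		return "attribute_not_exists(SessionID)"
	}
	return "version = :last_known"
}

func main() {
	fmt.Println(conditionFor(""))    // new session
	fmt.Println(conditionFor("v42")) // update of an existing session
}
```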
The Go SDK
AWS maintains an official SDK for Go, and it has some nice helpers that make our life easier; for example an [entity mapper](https://pkg.go.dev/github.com/aws/aws-sdk-go-v2/feature/dynamodb/attributevalue) that does the marshaling between the DynamoDB syntax and Go structs.
If you look for code examples for the Go AWS SDK on the web, always check the SDK version; v1 and v2 are pretty similar, but it can be a pain to “normalize” the code of mixed versions if you build your proof of concept out of snippets from different sources.
Store implementation
As I mentioned earlier, the data we push into Dynamo needs to include the key. To achieve this we can create a composite struct from the current `store.Session` item and the key itself:
|
|
From now on, when we save or load items in DynamoDB, we’ll be doing so using these `dynamoItem` objects.
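A sketch of such a composite struct, assuming `SessionID` as the key attribute and a simplified `Session`; the real field set lives in pajthy's `store` package:

```go
package main

import "fmt"

// Session mirrors pajthy's store.Session, simplified for the example.
type Session struct {
	Votes map[string]string
}

// dynamoItem embeds the session and adds the key (and version) attributes
// DynamoDB needs; the embedded fields get flattened when marshaled.
type dynamoItem struct {
	SessionID string
	Version   string
	Session
}

func main() {
	it := dynamoItem{SessionID: "abc", Version: "1", Session: Session{}}
	fmt.Println(it.SessionID)
}
```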
To produce the condition expression we have the following method:
|
|
Based on the `version` parameter of the method:
- if there was no version specified, it means the item we want to put is a new one (otherwise we’d provide a version 🧠), so in line 124 we return a condition that only passes if the `SessionID` attribute does not exist on the item; since this attribute is mandatory (it’s the key after all), its absence must mean that there is no session either
- if we had a version, we return the version equality check in line 127.
The actual `Save` looks like this now:
|
|
First, starting at line 89 we have our `dynamoItem` struct - with a new random version - marshaled into the DynamoDB format; we could have done this by hand as well, but in that case we would have to define the data types for all of the attributes, then convert all non-pointer variables to pointers, since that’s what the SDK expects. Better to use `MarshalMap` if you ask me 😅.
In line 94 there is another helper for building the condition: we have to provide attribute names and values separately alongside our condition; by using this builder we can just throw conditions in and it will spit out the values we need for the call in lines 104-106.
|
|
Based on the error `PutItem` returned, we either:
- return `ErrVersionMismatch` at line 113 if it was the conditional check that failed
- return the error as-is if it was some other error
- or return `nil` if the operation was successful.
Testing
When testing edge layers (like `store`), where the component directly communicates with an external service, I like to test with an actual instance of that service instead of mocking it. In our case we could have a user in a test account and test the behavior directly on a DynamoDB table running in AWS; however, this would make it difficult for others to easily test the changes (I won’t share my AWS credentials, for sure), and don’t forget about the financial aspect either: we are paying for executed operations here, after all.
Luckily Amazon provides a way to run a DynamoDB instance locally in a Docker container (for testing purposes), and with a container we are golden; we can spin up an instance easily with `testcontainers`:
|
|
We initialize the test by starting the testcontainer; in lines 27-36 we define the name of the image (using the latest version, since that’s what we’ll use in AWS as well), the ports we want to expose and a condition: execution will wait until the condition is met. Once the container is up and running (and its 8000/tcp port is available) we extract the host and port from the container: depending on where we run our test (on our machine? in a Docker container? in Docker on another machine?) and which port testcontainers maps the service to, these values can vary between executions.
We also have to tell the SDK that it should not go out to the cloud for the test table. In lines 43-52 we define a new endpoint resolver: if the service the SDK wants to reach is DynamoDB, it should go directly to our container.
Now that we have the infrastructure and the configuration set up, let’s see the table we’ll run our tests in:
|
|
With these in place we can run the `store` test suite against the DynamoDB implementation.
Finishing touches
The only thing that remains is to wire the `DynamoDB` store into the Lambda instead of the `InMemory` one:
|
|
The `LoadDefaultConfig` method is an awesome help for dealing with the configuration. By default it looks for authentication parameters in environment variables (these are set when running in Lambda). If it does not find those, it falls back to the `~/.aws/config` and `~/.aws/credentials` files; that comes in handy when running the code on the machine where you use the `aws` CLI too.
Build it, deploy it, see it working!
First impressions
After some testing on AWS I noticed that the first execution on a fresh Lambda container can now take up to one to one and a half seconds. This delay could be an issue for a “real time” service like pajthy. For now I’m letting it be: our goal in this series is to migrate the service, not to make it perfect; after all, we can always configure some provisioned concurrency to keep enough containers running to deal with the majority of the load. With a warm container, Lambda execution times usually stay under 100ms.
Conclusion
With this we are finished with the store layer! In this post we went through
- what DynamoDB is, in a nutshell
- how DynamoDB provides the functions we need for our Optimistic locking setup
- how to access these functions with the Go AWS SDK v2, and what additional goodies the library has to make our life easier
- how to test DynamoDB with a local docker container.
In the next post we’ll find a way to deal with the WebSocket events that make pajthy the interactive wonder that makes it actually useful!