CopyPastor

Detecting plagiarism made easy.

Score: 1; Reported for: Exact paragraph match

Possible Plagiarism

Reposted on 2019-09-26
by madhead

Original Post

Original - Posted on 2018-12-02
by madhead



            

In August 2018 [Amazon announced](https://aws.amazon.com/about-aws/whats-new/2018/08/use-amazon-dynamodb-local-more-easily-with-the-new-docker-image/) a new [Docker image](https://hub.docker.com/r/amazon/dynamodb-local/) with Amazon DynamoDB Local on board. It does not require downloading and running any JARs, nor using third-party OS-specific binaries like `sqlite4java`.
It is as simple as starting a Docker container before the tests:
```
docker run -p 8000:8000 amazon/dynamodb-local
```
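If you prefer Docker Compose for local development, the same container can be described declaratively. A minimal sketch (the service name is an assumption, pick whatever fits your project):

```yaml
# docker-compose.yml — minimal sketch for running DynamoDB Local
version: "3"
services:
  dynamodb-local:
    image: amazon/dynamodb-local
    ports:
      - "8000:8000"
```

Then `docker-compose up -d` brings the database up for the whole team with one command.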
You can do that manually for local development, as described above, or use it in your CI pipeline. Many CI services provide an ability to start additional containers during the pipeline that can provide dependencies for your tests. Here is an example for Gitlab CI/CD:
```yaml
test:
  stage: test
  image: openjdk:8-alpine
  services:
    - name: amazon/dynamodb-local
      alias: dynamodb-local
  script:
    - ./gradlew clean test
```
So, during the `test` task DynamoDB will be available on `http://dynamodb-local:8000`.
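In the tests themselves you can resolve the endpoint from an environment variable, falling back to localhost for local runs. A sketch (the `DYNAMODB_LOCAL_URL` variable name and this helper class are assumptions, not part of any SDK):

```java
// Resolves the DynamoDB endpoint for tests: in CI an environment
// variable points at the service container, locally it falls back
// to the default DynamoDB Local port.
public final class DynamoDbEndpoint {
    static final String DEFAULT_URL = "http://localhost:8000";

    public static String resolve() {
        final String fromEnv = System.getenv("DYNAMODB_LOCAL_URL");
        return (fromEnv == null || fromEnv.isEmpty()) ? DEFAULT_URL : fromEnv;
    }
}
```

This keeps the URL out of the test code entirely, in line with the externalized-configuration idea below.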
Another, more powerful tool is [localstack](https://github.com/localstack/localstack). It supports about two dozen AWS services, DynamoDB being one of them. The usage is very similar: you start it before running the tests and it exposes AWS-compatible APIs on [given ports](https://github.com/localstack/localstack#overview):
```yaml
test:
  stage: test
  image: openjdk:8-alpine
  services:
    - name: localstack/localstack
      alias: localstack
  script:
    - ./gradlew clean test
```
The idea is to move all the configuration out of your build tool and tests and provide the dependency externally. Think of it as dependency injection / IoC, but for a whole service, not just a single bean. This way, your code is [cleaner and more maintainable](https://12factor.net/backing-services). You can see that even in the examples above: you can switch the mock implementation from DynamoDB Local to localstack simply by changing the `image` part!
In August 2018 [Amazon announced](https://aws.amazon.com/about-aws/whats-new/2018/08/use-amazon-dynamodb-local-more-easily-with-the-new-docker-image/) a new [Docker image](https://hub.docker.com/r/amazon/dynamodb-local/) with Amazon DynamoDB Local on board. It does not require downloading and running any JARs, nor using third-party OS-specific binaries (I'm talking about `sqlite4java`).
It is as simple as starting a Docker container before the tests:
```
docker run -p 8000:8000 amazon/dynamodb-local
```
You can do that manually for local development, as described above, or use it in your CI pipeline. Many CI services provide an ability to start additional containers during the pipeline that can provide dependencies for your tests. Here is an example for Gitlab CI/CD:
```yaml
test:
  stage: test
  image: openjdk:8-alpine
  services:
    - name: amazon/dynamodb-local
      alias: dynamodb-local
  script:
    - DYNAMODB_LOCAL_URL=http://dynamodb-local:8000 ./gradlew clean test
```
Or Bitbucket Pipelines:
```yaml
definitions:
  services:
    dynamodb-local:
      image: amazon/dynamodb-local

step:
  name: test
  image:
    name: openjdk:8-alpine
  services:
    - dynamodb-local
  script:
    - DYNAMODB_LOCAL_URL=http://localhost:8000 ./gradlew clean test
```
And so on. The idea is to move all the configuration you can see in [other](https://stackoverflow.com/a/37780083/750510) [answers](https://stackoverflow.com/a/39086207/750510) out of your build tool and provide the dependency externally. Think of it as dependency injection / IoC, but for a whole service, not just a single bean.
After you've started the container you can create a client pointing to it:
```java
private AmazonDynamoDB createAmazonDynamoDB(final DynamoDBLocal configuration) {
    return AmazonDynamoDBClientBuilder
            .standard()
            .withEndpointConfiguration(
                    new AwsClientBuilder.EndpointConfiguration(
                            "http://localhost:8000",
                            Regions.US_EAST_1.getName()
                    )
            )
            .withCredentials(
                    new AWSStaticCredentialsProvider(
                            // DynamoDB Local works with any non-null credentials
                            new BasicAWSCredentials("", "")
                    )
            )
            .build();
}
```
Now to the original questions:
> You have to somehow start the server before your tests run
You can just start it manually, or prepare a developers' script for it. IDEs usually provide a way to run arbitrary commands before executing a task, so you can [make the IDE](https://www.jetbrains.com/help/idea/run-debug-configuration-junit.html#before-launch-options) start the container for you. I think that running something locally should not be a top priority in this case; instead, focus on configuring CI and let the developers start the container however is comfortable for them.
>The server isn't started and stopped before each test so tests become inter-dependent unless you add code to delete all tables, etc. after each test
That's true, but… You should not start and stop such heavyweight things, nor recreate tables, before / after each test. DB tests are almost always inter-dependent, and that's OK for them. Just use unique values for each test case (e.g. set the item's hash key to the ticket id / the specific test case id you're working on). As for the seed data, I'd recommend moving it out of the build tool and test code as well: either make your own image with all the data you need, or use the AWS CLI to create tables and insert data. Follow the single responsibility and dependency injection principles: your test code must not do anything but test. All the environment (tables and data, in this case) should be provided for it. Creating a table in a test is wrong, because in real life that table already exists (unless you're testing a method that actually creates a table, of course).
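The unique-values idea can be sketched as a tiny helper: prefix each hash key with the test case id and a random UUID, so test cases never collide even though they share one long-running database (the class and method names here are assumptions for illustration):

```java
import java.util.UUID;

// Gives each test case its own key space, so tests stay independent
// without recreating tables between runs.
public final class TestKeys {
    public static String uniqueHashKey(final String testCaseId) {
        return testCaseId + "#" + UUID.randomUUID();
    }
}
```

A test would then write items under `TestKeys.uniqueHashKey("TICKET-42")` and query only its own keys.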
>All developers need to have it installed
Docker should be a must for every developer in 2018, so that's not a problem.
----------
And if you're using JUnit 5, it can be a good idea to use a [DynamoDB Local extension](https://gitlab.com/madhead/aws-junit5/blob/master/docs/dynamodb.adoc) that will inject the client into your tests (yes, this is self-promotion):
1. Add the [JCenter](https://bintray.com/bintray/jcenter) repository to your build.
*pom.xml*:
```xml
<repositories>
    <repository>
        <snapshots>
            <enabled>false</enabled>
        </snapshots>
        <id>central</id>
        <name>bintray</name>
        <url>https://jcenter.bintray.com</url>
    </repository>
</repositories>
```
*build.gradle*:
```groovy
repositories {
    jcenter()
}
```
1. Add a dependency on `by.dev.madhead.aws-junit5:dynamodb-v1`
*pom.xml*:
```xml
<dependency>
    <groupId>by.dev.madhead.aws-junit5</groupId>
    <artifactId>dynamodb-v1</artifactId>
    <version>1.0.0</version>
    <scope>test</scope>
</dependency>
```
*build.gradle*:
```groovy
dependencies {
    testImplementation("by.dev.madhead.aws-junit5:dynamodb-v1:1.0.0")
}
```
1. Use the extension in your tests:
```java
@ExtendWith(DynamoDBLocalExtension.class)
class MultipleInjectionsTest {
    @DynamoDBLocal(
        url = "http://dynamodb-local-1:8000"
    )
    private AmazonDynamoDB first;

    @DynamoDBLocal(
        urlEnvironmentVariable = "DYNAMODB_LOCAL_URL"
    )
    private AmazonDynamoDB second;

    @Test
    void test() {
        first.listTables();
        second.listTables();
    }
}
```

        