Panther is a collection of serverless applications deployed within your AWS account. The frontend is a React application which runs in a Docker container (via ECS), and the backend is a collection of compute (Lambda), storage (DynamoDB / S3), and other supporting services.
The sections below cover how Panther works under the hood, how to build and deploy from source, and how to extend Panther to meet your individual needs.
This diagram provides an overview of the core components of Panther, and how they are connected.
For a more detailed architecture diagram, see the bottom of this page.
To deploy from source, install Docker and make sure the daemon is running in the background.
For the remaining dependencies, you can either use our development image or install development dependencies locally.
Using the development image is the easier option, but it also leads to much slower builds.
Simply export your AWS credentials as environment variables, and then run:

```shell
./dev.sh
```

From here, run `mage setup` and you're good to go.
To install dependencies locally (recommended for regular contributors):
For example, on macOS with Homebrew:

```shell
brew install go node@12 python3
export PATH=$HOME/go/bin:$PATH
go get github.com/magefile/mage
go get golang.org/x/tools/cmd/goimports
```
Then install the remaining development libraries:
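Based on the `setup` target described in the mage target list below ("Install all build and development dependencies"), this is done with:

```shell
mage setup
```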
Panther uses mage, a Go tool similar to make, to manage the development lifecycle. Run `mage` from the repo root to see the list of available commands:
```
Targets:
  build:api           Generate API source files from GraphQL + Swagger
  build:cfn           Generate CloudFormation templates in out/deployments folder
  build:lambda        Compile Go Lambda function source
  build:tools         Compile devtools and opstools
  clean               Remove dev libraries and build/test artifacts
  deploy              Deploy Panther to your AWS account
  doc                 Auto-generate specific sections of documentation
  fmt                 Format source files
  master:deploy       Deploy single master template (deployments/master.yml) nesting all other stacks
  master:publish      Publish a new Panther release (Panther team only)
  setup               Install all build and development dependencies
  teardown            Destroy all Panther infrastructure
  test:cfn            Lint CloudFormation and Terraform templates
  test:ci             Run all required checks for a pull request
  test:doc            Verify links and assets in documentation
  test:go             Test and lint Golang source code
  test:integration    Run integration tests (integration_test.go, integration.py)
  test:python         Test and lint Python source code
  test:web            Test and lint web source
```
You can easily chain mage commands together, for example:

```shell
mage clean setup test:ci deploy
```
Since the majority of Panther is written in Go, the repo follows the standard Go project layout:
- Input/output models for communicating with Panther's backend APIs
- Dockerfiles for CI and deployment
- Go dev and ops tools
- CloudFormation templates for deploying Panther itself or integrating the accounts you want to scan
- Documentation, license headers, README, images, code of conduct, etc.
- Source code for all of Panther's Lambda functions
- Standalone Go libraries that could be directly imported by other projects
- Magefile source and other build infrastructure
- Source for the Panther web application
Run our test suite:
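Per the `test:ci` target listed above ("Run all required checks for a pull request"), this is:

```shell
mage test:ci
```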
Run integration tests against a live deployment:
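Per the `test:integration` target listed above, this is:

```shell
mage test:integration
```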
To run tests for only one package:

```shell
PKG=./internal/compliance/compliance-api/main mage test:integration
```
Configure your AWS credentials and deployment region:
```shell
export AWS_REGION=us-east-1  # Any supported region
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...
```
Panther relies on dozens of AWS services, some of which are not yet available in every region. In particular, AppSync, Cognito, Athena, and Glue are newer services that are not available in the GovCloud (US) and China partitions, among other regions. At the time of writing, all Panther backend components are supported in the following regions:
us-east-1 (N. Virginia)
Consult the AWS region table for the source of truth about service availability in each region.
Now you can run `mage deploy`.
If you're using the development image, be sure to export your AWS credentials in the environment before running `./dev.sh`.
If your credentials timeout, you can safely redeploy to pick up where you left off.
If you use aws-vault, you must be authenticated with MFA. Otherwise, IAM role creation will fail.
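With aws-vault, the deploy invocation might look like the following sketch; the profile name `panther-dev` is a hypothetical placeholder for whichever profile you have configured with an MFA serial:

```shell
# Run the deploy inside an aws-vault session.
# "panther-dev" is a placeholder profile name, not something Panther defines;
# the profile must be configured for MFA so IAM role creation succeeds.
aws-vault exec panther-dev -- mage deploy
```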
You can also update a single stack in an existing deployment:

```shell
STACK=appsync mage deploy
```
This will deploy the main CloudFormation stacks independently and is optimized for development. If instead you want to deploy the single master template:
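Per the `master:deploy` target listed above ("Deploy single master template (deployments/master.yml) nesting all other stacks"), this is:

```shell
mage master:deploy
```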
Panther relies on a number of custom CloudFormation resources. Like any resource, these will not be updated unless the input parameters have changed. You can force an update of most custom resources by overriding their version:

```shell
CUSTOM_RESOURCE_VERSION=v1.5.0 mage deploy
```
You can also deploy from an EC2 instance with Docker and git installed, in the same region you're deploying Panther to. Instead of exporting your AWS credentials as environment variables, attach the deployment IAM role to your EC2 instance profile. We recommend at least an `m5.large` instance type, but even one as small as `t2.small` should be sufficient.
Run `mage teardown` to remove all Panther infrastructure, including S3 buckets that are normally retained if just the CloudFormation stack were deleted. If you have a single top-level Panther stack, then run:

```shell
STACK=your-stack-name mage teardown
```
In summary, teardown will:
- Delete all Panther stacks
- Delete all S3 buckets tagged with
  - If there are too many items to delete manually, an expiration policy is set instead
- Delete all CloudWatch log groups prefixed with
  - Be careful running teardown if you have your own "panther" service which might be writing log groups with that prefix
The diagrams below can be used to understand Panther's architecture at a deeper level and provide insight into data flows.
This diagram provides additional detail to the high-level diagram above:
While more detailed than the overview above, this diagram also simplifies some implementation details for clarity. For example, most Lambda functions do not invoke each other directly but instead communicate via SQS queues or DynamoDB streams.
This diagram shows where and how your data is stored and processed:
The arrows above indicate the direction in which data is transferred, whereas the arrows in the previous diagrams indicate the direction in which communication is initiated.