Guide to surveilr Evidence Collection and Integration
Evidence Collection Workflow
To collect evidence from cloud platforms such as AWS and Azure, we use tools like Steampipe and CNquery. These tools query the cloud infrastructure, retrieve the necessary data, and store it in an RSSD (Resource Surveillance State Database) SQLite file for subsequent querying and analysis.
Workflow Overview:
- Steampipe and CNquery Configuration: Configure the tools to retrieve data from the cloud platforms, and save the queries as tasks in JSONL format for ingestion.
- Ingestion into surveilr: Use the surveilr ingest tasks process to import the JSONL files into the surveilr system and convert them into an SQLite database format.
- SQLite Database: The resulting SQLite database can be accessed, queried, and analyzed as needed.
To begin the process, execute the following command to ingest the data directly into surveilr:
cat cloud-steampipe-surveilr.jsonl | surveilr ingest tasks
This command ingests the cloud platform data and stores it in the RSSD SQLite format for later use.
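The ingest step reads a JSONL task file from stdin. As a sketch of what `cloud-steampipe-surveilr.jsonl` might contain (the key-to-command layout, task names, and Steampipe tables here are assumptions; check surveilr's task-format documentation and the plugins you actually have installed):

```shell
# Illustrative task file for `surveilr ingest tasks`.
# Assumption: each JSONL line maps a task name to a shell command whose
# output surveilr captures; table names come from the Steampipe AWS plugin.
cat > cloud-steampipe-surveilr.jsonl <<'EOF'
{"aws-s3-buckets": "steampipe query \"select name, region from aws_s3_bucket\" --output json"}
{"aws-ec2-instances": "steampipe query \"select instance_id, instance_type from aws_ec2_instance\" --output json"}
EOF
```

Each line defines one task; piping the file into `surveilr ingest tasks` executes the commands and stores their output in the RSSD.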
Prerequisites
Before proceeding, ensure that the following tools are installed on your Management Server. These tools are essential for interacting with cloud platforms and managing data ingestion:
- surveilr
- AWS CLI
- Steampipe
- CNquery
1. surveilr Installation
Ensure surveilr is installed on your Management Server before running the command cat cloud-steampipe-surveilr.jsonl | surveilr ingest tasks.
You do not need to install surveilr on other nodes (servers). It only needs to be installed on the Management Server where the data ingestion process occurs.
To install surveilr on the Management Server, use one of the following methods:
Default Installation:
curl -sL https://raw.githubusercontent.com/opsfolio/releases.opsfolio.com/main/surveilr/install.sh | sh
Custom Installation Path:
SURVEILR_HOME="$HOME/bin" curl -sL https://raw.githubusercontent.com/opsfolio/releases.opsfolio.com/main/surveilr/install.sh | sh
Alternatively, you can use eget to install surveilr:
eget opsfolio/releases.opsfolio.com --asset tar.gz
For help on using surveilr commands:
surveilr --version                     # Get version info
surveilr --help                        # CLI help
surveilr --completions fish | source   # Shell completions for easier usage
2. AWS CLI Installation
To install AWS CLI, follow these steps based on your operating system.
Linux:
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
macOS:
curl "https://awscli.amazonaws.com/AWSCLIV2.pkg" -o "AWSCLIV2.pkg"
sudo installer -pkg AWSCLIV2.pkg -target /
Windows:
msiexec.exe /i https://awscli.amazonaws.com/AWSCLIV2.msi
For more information, refer to the AWS CLI Installation Guide.
3. Steampipe Installation
Steampipe is used for querying cloud services and requires a plugin for each cloud provider. Follow the steps below to install it:
macOS:
brew install turbot/tap/steampipe
Linux/Windows (WSL2):
sudo /bin/sh -c "$(curl -fsSL https://steampipe.io/install/steampipe.sh)"
After installation, install the necessary plugins for your cloud provider, e.g., AWS:
steampipe plugin install aws
For more information, refer to the Steampipe Plugin Documentation.
Configuration details are saved in the following directory:
~/.steampipe/config/
A sample configuration for AWS (aws.spc):
connection "aws" {
  plugin     = "aws"
  regions    = ["us-xxxx"]
  access_key = "AKxxxxxxxxxxxxxxxxxxH"
  secret_key = "fSxxxxxxxxxxxxxxxxxxxx7t"
}
To start the service:
steampipe service start
4. CNquery Installation
CNquery is another tool for querying system configurations. To install it, use the commands below:
Linux and macOS:
bash -c "$(curl -sSL https://install.mondoo.com/sh)"
Windows:
Set-ExecutionPolicy Unrestricted -Scope Process -Force;
[System.Net.ServicePointManager]::SecurityProtocol = [System.Net.ServicePointManager]::SecurityProtocol -bor 3072;
iex ((New-Object System.Net.WebClient).DownloadString('https://install.mondoo.com/ps1/cnquery'));
Install-Mondoo -Product cnquery;
To run queries, use:
cnquery run TARGET -c "QUERY"
Example to list services and their statuses on a local system:
cnquery run local -c "services.list { name running }"
For more information, refer to the CNquery Documentation.
surveilrctl
surveilrctl automates osQuery setup and connects nodes to the osQuery management server (which runs on the Management Server where surveilr is installed). It simplifies installation, certificate retrieval, and node configuration.
Run the following setup on each node that will connect to the osQuery management server.
Quick Installation for surveilrctl on Nodes
Linux & macOS:
export SURVEILR_HOST=https://your-host
curl -sL surveilr.com/surveilrctl.sh | bash
Windows:
To install surveilrctl on Windows nodes, run:
irm https://surveilr.com/surveilrctl.ps1 | iex
For automatic setup:
$env:SURVEILR_HOST="https://your-host"; irm https://surveilr.com/surveilrctl.ps1 | iex
Note: Be sure to run PowerShell as Administrator on Windows.
surveilrctl Node Setup
To set up surveilrctl on the nodes and connect them to the osQuery management server, use the following command:
surveilrctl setup --uri https://your-host
# Example:
surveilrctl setup --uri https://osquery-ms.example.com
If the server requires Basic Authentication, use:
surveilrctl setup --uri https://osquery-ms.example.com --username admin --password securepass
To specify custom file paths for certificates and secrets:
surveilrctl setup --uri https://osquery-ms.example.com \
  --cert-path /path/to/cert.pem \
  --secret-path /path/to/secret.txt
To upgrade to the latest version of surveilrctl:
surveilrctl upgrade
Opsfolio Penetration Toolkit
The Opsfolio Penetration Toolkit is a comprehensive suite of tools for penetration testing. It includes Nmap for network discovery and security audits. The toolkit is fully automated via GitHub Actions, allowing scheduled tests without manual intervention.
Key Features:
- Automated Testing: Runs on GitHub-managed remote runners, enabling regular tests without manual effort.
- Comprehensive Toolset: Includes Nmap for network discovery and security audits.
- Centralized Reporting: Aggregates Nmap XML outputs into a single SQLite database for efficient querying and reporting.
- Advanced Querying: Use SQL to query and analyze the data stored in the SQLite database.
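For example, the SQLite database can be opened directly with the sqlite3 CLI. The sketch below builds a throwaway stand-in database so the query pattern is self-contained; the `uniform_resource` table and its columns are assumptions about the RSSD schema, so run `.schema` against your real resource-surveillance.sqlite.db to confirm table names before querying.

```shell
# Build a tiny stand-in database to show the query pattern end to end;
# in practice, open the RSSD file generated by surveilr instead.
sqlite3 demo-rssd.sqlite.db <<'EOF'
CREATE TABLE IF NOT EXISTS uniform_resource (uri TEXT, nature TEXT, content BLOB);
INSERT INTO uniform_resource VALUES ('nmap://scan-2024-01-01', 'xml', '<nmaprun/>');
EOF

# The same SELECT works unchanged against the real database file:
sqlite3 demo-rssd.sqlite.db "SELECT uri, nature FROM uniform_resource;"
```

The default sqlite3 output separates columns with `|`; use `.mode json` or `.mode table` inside the shell for other layouts.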
Configuring Variables in GitHub
To set up variables for the Nmap penetration testing workflow:
- Go to your GitHub repository.
- Navigate to Settings > Secrets and variables > Actions.
- Under Variables, click New repository variable.
- Name the variable ENDPOINTS and enter values in the format hostname|ip_address_or_domain_name|boundary.
  - Example: EC2_PRIME|19x.xx.xx.x7|AWS_EC2
- Click Add variable.
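Malformed entries are easy to catch before saving them in GitHub. The helper below is hypothetical (not part of the toolkit) and assumes the three-field name|host|boundary layout shown in the example above:

```shell
# Hypothetical helper: verify an ENDPOINTS entry has exactly three
# pipe-separated fields (name|host|boundary) before adding it in GitHub.
check_endpoint() {
  if [ "$(printf '%s' "$1" | awk -F'|' '{print NF}')" -eq 3 ]; then
    echo "OK: $1"
  else
    echo "BAD: $1"
  fi
}

check_endpoint 'EC2_PRIME|19x.xx.xx.x7|AWS_EC2'   # prints "OK: ..."
check_endpoint 'EC2_PRIME|19x.xx.xx.x7'           # prints "BAD: ..." (boundary missing)
```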
Sharing the Generated RSSD File
Once you have completed the evidence collection and the RSSD SQLite file (named resource-surveillance.sqlite.db) is generated, please share the file with us using any of the following methods:
- Google Drive: Upload the file to your Google Drive and share the link with us.
- Dropbox: Upload the file to Dropbox and send us the shared link.
- Other Cloud Services: You can also use other cloud file-sharing services that allow you to upload and share files via a link (e.g., OneDrive, Box, etc.).
Please make sure the file is accessible by link sharing, and ensure the correct permissions are set so we can access and analyze the data.