An easily deployable, single-instance version of Snowplow that serves three use cases:
- Gives a Snowplow consumer (e.g. an analyst, data team, or marketing team) a way to quickly understand what Snowplow "does", i.e. what you put in at one end and take out of the other
- Gives developers new to Snowplow an easy way to start with Snowplow and understand how the different pieces fit together
- Gives people running Snowplow a quick way to debug tracker updates, because they can be tested against a Mini instance first
- Data is tracked and processed in real time
- Includes an Iglu Server so that custom schemas can be uploaded
- Data is validated during processing
- This is done using both our standard Iglu schemas and any custom ones that you have loaded into the Iglu Server
- Data is loaded into Elasticsearch
- Can be queried directly or through a Kibana dashboard
- Good and bad events are in distinct indexes
- Create a UI to indicate what is happening in each of the different subsystems (collector, enrich, etc.), so as to give developers an in-depth understanding of how the Snowplow subsystems work with one another
Snowplow Mini runs several distinct applications on the same box, all linked by NSQ topics. In a production deployment, each instance could be an Auto Scaling group and each NSQ topic would be a distinct Kinesis stream.
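The wiring between the applications can be summarized as a simple topology map. This is only a sketch to make the data flow explicit: the application keys are informal labels, not real configuration, while the topic names are taken from this README.

```python
# Sketch of Snowplow Mini's NSQ wiring. Topic names come from this README;
# the application labels are informal, not actual Snowplow Mini config.
PIPELINE = {
    "stream-collector":        {"in": [],                 "out": ["RawEvents", "BadEvents"]},
    "stream-enrich":           {"in": ["RawEvents"],      "out": ["EnrichedEvents", "BadEvents"]},
    "elasticsearch-sink-good": {"in": ["EnrichedEvents"], "out": ["BadElasticsearchEvents"]},
    "elasticsearch-sink-bad":  {"in": ["BadEvents"],      "out": ["BadElasticsearchEvents"]},
}

def consumers_of(topic: str) -> list[str]:
    """Return the applications that read from a given NSQ topic."""
    return [app for app, io in PIPELINE.items() if topic in io["in"]]

print(consumers_of("BadEvents"))  # which application drains the bad-event topic
```

Note that `BadEvents` receives writes from two producers (the collector and enrich) but has a single consumer, the bad sink.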
- Stream Collector:
  - Starts a server listening on `http://<sp-mini-public-ip>/`, to which events can be sent
  - Sends "good" events to the `RawEvents` NSQ topic
  - Sends "bad" events to the `BadEvents` NSQ topic
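As an illustration of sending an event to the collector, the sketch below builds a tracking URL by hand. The host is a placeholder, and the field names (`e`, `url`, `p`, `aid`) follow the Snowplow tracker protocol's GET pixel endpoint (`/i`); in practice you would use one of the Snowplow trackers instead.

```python
from urllib.parse import urlencode

# Placeholder host; substitute your Snowplow Mini instance's public IP.
COLLECTOR = "http://sp-mini.example.com"

def pageview_url(page_url: str, app_id: str = "mini-test") -> str:
    """Build a GET URL for the collector's pixel endpoint (/i) using
    Snowplow tracker-protocol fields: e=pv marks a page view, p is the
    platform, and aid the application id."""
    params = {"e": "pv", "url": page_url, "p": "web", "aid": app_id}
    return f"{COLLECTOR}/i?{urlencode(params)}"

print(pageview_url("http://example.com/landing"))
```

Requesting such a URL (e.g. with `curl`) produces one "good" event on the `RawEvents` topic, assuming the payload is well formed.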
- Stream Enrich:
  - Reads events in from the `RawEvents` NSQ topic
  - Sends events which passed the enrichment process to the `EnrichedEvents` NSQ topic
  - Sends events which failed the enrichment process to the `BadEvents` NSQ topic
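The enrich step's routing decision can be sketched as a single function. Real enrichment validates events against Iglu schemas (standard and custom); as a stand-in, this sketch only requires the payload to parse as a JSON object with an event-type field, which is enough to show the good/bad split.

```python
import json

def route(raw_event: bytes):
    """Sketch of Stream Enrich's routing: events that pass validation go
    to the EnrichedEvents topic, failures to BadEvents. The check here is
    a stand-in for real Iglu schema validation."""
    try:
        event = json.loads(raw_event)
        if not isinstance(event, dict) or "e" not in event:
            raise ValueError("missing event type field 'e'")
        return "EnrichedEvents", event
    except ValueError:  # json.JSONDecodeError is a subclass of ValueError
        return "BadEvents", raw_event
```

Failed events are not dropped: they carry on down the pipeline to the bad index, which is what makes Mini useful for debugging tracker updates.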
- Elasticsearch Sink Good:
  - Reads events from the `EnrichedEvents` NSQ topic
  - Sends those events to the `good` Elasticsearch index
  - On failure to insert, writes errors to the `BadElasticsearchEvents` NSQ topic
- Elasticsearch Sink Bad:
  - Reads events from the `BadEvents` NSQ topic
  - Sends those events to the `bad` Elasticsearch index
  - On failure to insert, writes errors to the `BadElasticsearchEvents` NSQ topic
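Both sinks share the same failure-handling shape: try the insert and, if it fails, publish the error to `BadElasticsearchEvents` rather than dropping the event. A minimal sketch, where `insert` and `publish` are stand-ins for the Elasticsearch client and the NSQ producer:

```python
def sink(events, index, insert, publish):
    """Sketch of the Elasticsearch sinks' failure handling: each event is
    written to its index ("good" or "bad"); any insert error is published
    to the BadElasticsearchEvents NSQ topic instead of being dropped.
    `insert` and `publish` stand in for the ES client and NSQ producer."""
    for event in events:
        try:
            insert(index, event)
        except Exception as exc:
            publish("BadElasticsearchEvents", {"event": event, "error": str(exc)})
```

Routing insert failures to a topic, rather than logging and discarding them, mirrors the pipeline's overall principle that no event is silently lost.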
These events can then be viewed in Kibana at `http://<sp-mini-public-ip>/kibana`.
Documentation is available at our docs website.
Some advice on how to handle certain errors if you're trying to build Snowplow Mini locally with Vagrant:

- Your Vagrant version is probably outdated. Use Vagrant 2.0.0+.
- If the error is caused by trying to use NFS, comment out the relevant lines in the `Vagrantfile`.
- Most likely a failure will happen on `TASK [sp_mini_5_build_ui : Install npm packages based on package.json.]`, but see also: https://discourse.snowplowanalytics.com/t/snowplow-mini-local-vagrant/2930.
Snowplow Mini is copyright 2016-2021 Snowplow Analytics Ltd.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this software except in compliance with the License.
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.