GIT binary patch (binary image content omitted)
diff --git a/docs/data-product-studio/snowplow-cli/index.md b/docs/data-product-studio/snowplow-cli/index.md
new file mode 100644
index 0000000000..30eda477d9
--- /dev/null
+++ b/docs/data-product-studio/snowplow-cli/index.md
@@ -0,0 +1,84 @@
+---
+title: Snowplow CLI
+sidebar_label: Snowplow CLI
+sidebar_position: 7
+---
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
+
+`snowplow-cli` brings the data management features of Snowplow Console to the command line. It lets you download your data structures and data products as YAML or JSON files and publish them back to Console, enabling GitOps-style workflows with reviews and branching.
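+For example, a typical round trip looks like this:
+
+```bash
+snowplow-cli ds download      # pull your data structures from Console
+# ...edit the files locally, commit, review...
+snowplow-cli ds validate      # validate the local changes with Console
+snowplow-cli ds publish dev   # publish them back to your development environment
+```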
+
+# Install
+
+Snowplow CLI can be installed with [Homebrew](https://brew.sh/):
+```
+brew install snowplow-product/taps/snowplow-cli
+```
+
+Verify the installation with:
+```
+snowplow-cli --help
+```
+
+For systems where Homebrew is not available, binaries for multiple platforms can be found on the [releases page](https://github.com/snowplow-product/snowplow-cli/releases).
+
+Example installation for `linux_x86_64` using `curl`:
+
+```bash
+curl -L -o snowplow-cli https://github.com/snowplow-product/snowplow-cli/releases/latest/download/snowplow-cli_linux_x86_64
+chmod u+x snowplow-cli
+```
+
+Verify the installation with:
+```
+./snowplow-cli --help
+```
+
+# Configure
+
+You will need three values.
+
+An API key ID and the corresponding API key (secret), which are generated from the [credentials section](https://console.snowplowanalytics.com/credentials) in BDP Console.
+
+The organization ID, which can be retrieved from the URL immediately following the `.com` when visiting BDP Console:
+
+![](./images/orgID.png)
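+That is, the organization ID is the path segment immediately after the domain:
+
+```
+https://console.snowplowanalytics.com/<organization-id>/...
+```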
+
+Snowplow CLI can take its configuration from a variety of sources. More details are available from `./snowplow-cli data-structures --help`. Variations on these three examples should serve most cases.
+
+<Tabs>
+  <TabItem value="env" label="Environment variables" default>
+
+  ```bash
+  SNOWPLOW_CONSOLE_API_KEY_ID=********-****-****-****-************
+  SNOWPLOW_CONSOLE_API_KEY=********-****-****-****-************
+  SNOWPLOW_CONSOLE_ORG_ID=********-****-****-****-************
+  ```
+
+  </TabItem>
+  <TabItem value="config" label="Config file">
+
+  ```yaml
+  console:
+    api-key-id: ********-****-****-****-************
+    api-key: ********-****-****-****-************
+    org-id: ********-****-****-****-************
+  ```
+
+  </TabItem>
+  <TabItem value="flags" label="Command line flags">
+
+  ```bash
+  ./snowplow-cli data-structures --api-key-id ********-****-****-****-************ --api-key ********-****-****-****-************ --org-id ********-****-****-****-************
+  ```
+
+  </TabItem>
+</Tabs>
+
+Snowplow CLI defaults to YAML output. You can switch to JSON by providing the `--output-format json` flag or by setting the `output-format: json` config value. This applies to all commands where the format matters, not only `generate`.
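+For example, to download data structures as JSON:
+
+```bash
+snowplow-cli ds download --output-format json
+```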
+
+
+# Use cases
+
+- [Manage your data structures with Snowplow CLI](/docs/data-product-studio/data-structures/manage/cli/index.md)
+- [Set up a GitHub CI/CD pipeline to manage data structures and data products](/docs/resources/recipes-tutorials/recipe-data-structures-in-git/index.md)
diff --git a/docs/data-product-studio/snowplow-cli/reference/index.md b/docs/data-product-studio/snowplow-cli/reference/index.md
new file mode 100644
index 0000000000..7b912230af
--- /dev/null
+++ b/docs/data-product-studio/snowplow-cli/reference/index.md
@@ -0,0 +1,542 @@
+---
+title: Command Reference
+date: 2024-11-26
+sidebar_label: Command Reference
+sidebar_position: 1
+---
+
+This page contains the complete reference for the Snowplow CLI commands.
+
+## Data-Products
+
+
+Work with Snowplow data products
+
+### Examples
+
+```
+ $ snowplow-cli data-products validate
+```
+
+### Options
+
+```
+ -S, --api-key string BDP console api key
+ -a, --api-key-id string BDP console api key id
+ -h, --help help for data-products
+ -H, --host string BDP console host (default "https://console.snowplowanalytics.com")
+ -m, --managed-from string Link to a github repo where the data structure is managed
+ -o, --org-id string Your organization id
+```
+
+### Options inherited from parent commands
+
+```
+ --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml
+ Then on:
+ Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml
+ Darwin $HOME/Library/Application Support/snowplow/snowplow.yml
+ Windows %AppData%\snowplow\snowplow.yml
+ --debug Log output level to Debug
+ --json-output Log output as json
+ -q, --quiet Log output level to Warn
+ -s, --silent Disable output
+```
+
+
+
+## Data-Products Download
+
+
+Download all data products, event specs and source apps from BDP Console
+
+### Synopsis
+
+Downloads the latest versions of all data products, event specs and source apps from BDP Console.
+
+If no directory is provided, it defaults to 'data-products' in the current directory. Source apps are stored in the nested 'source-apps' directory.
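+For example, a download might produce a layout like this (illustrative file names):
+
+```
+data-products/
+├── login.yaml
+└── source-apps/
+    └── website.yaml
+```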
+
+```
+snowplow-cli data-products download {directory ./data-products} [flags]
+```
+
+### Examples
+
+```
+ $ snowplow-cli dp download
+ $ snowplow-cli dp download ./my-data-products
+```
+
+### Options
+
+```
+ -h, --help help for download
+ -f, --output-format string Format of the files to read/write. json or yaml are supported (default "yaml")
+```
+
+### Options inherited from parent commands
+
+```
+ -S, --api-key string BDP console api key
+ -a, --api-key-id string BDP console api key id
+ --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml
+ Then on:
+ Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml
+ Darwin $HOME/Library/Application Support/snowplow/snowplow.yml
+ Windows %AppData%\snowplow\snowplow.yml
+ --debug Log output level to Debug
+ -H, --host string BDP console host (default "https://console.snowplowanalytics.com")
+ --json-output Log output as json
+ -m, --managed-from string Link to a github repo where the data structure is managed
+ -o, --org-id string Your organization id
+ -q, --quiet Log output level to Warn
+ -s, --silent Disable output
+```
+
+
+
+## Data-Products Publish
+
+
+Publish all data products, event specs and source apps to BDP Console
+
+### Synopsis
+
+Publish the local versions of all data products, event specs and source apps to BDP Console.
+
+If no directory is provided, it defaults to 'data-products' in the current directory. Source apps are stored in the nested 'source-apps' directory.
+
+```
+snowplow-cli data-products publish {directory ./data-products} [flags]
+```
+
+### Examples
+
+```
+ $ snowplow-cli dp publish
+ $ snowplow-cli dp publish ./my-data-products
+```
+
+### Options
+
+```
+ -h, --help help for publish
+```
+
+### Options inherited from parent commands
+
+```
+ -S, --api-key string BDP console api key
+ -a, --api-key-id string BDP console api key id
+ --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml
+ Then on:
+ Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml
+ Darwin $HOME/Library/Application Support/snowplow/snowplow.yml
+ Windows %AppData%\snowplow\snowplow.yml
+ --debug Log output level to Debug
+ -H, --host string BDP console host (default "https://console.snowplowanalytics.com")
+ --json-output Log output as json
+ -m, --managed-from string Link to a github repo where the data structure is managed
+ -o, --org-id string Your organization id
+ -q, --quiet Log output level to Warn
+ -s, --silent Disable output
+```
+
+
+
+## Data-Products Validate
+
+
+Validate data structures with BDP Console
+
+### Synopsis
+
+Sends all data products and source applications from the provided paths for validation by BDP Console.
+
+```
+snowplow-cli data-products validate [paths...] [flags]
+```
+
+### Examples
+
+```
+ $ snowplow-cli dp validate ./data-products ./source-applications
+ $ snowplow-cli dp validate ./src
+```
+
+### Options
+
+```
+ --gh-annotate Output suitable for github workflow annotation (ignores -s)
+ -h, --help help for validate
+```
+
+### Options inherited from parent commands
+
+```
+ -S, --api-key string BDP console api key
+ -a, --api-key-id string BDP console api key id
+ --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml
+ Then on:
+ Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml
+ Darwin $HOME/Library/Application Support/snowplow/snowplow.yml
+ Windows %AppData%\snowplow\snowplow.yml
+ --debug Log output level to Debug
+ -H, --host string BDP console host (default "https://console.snowplowanalytics.com")
+ --json-output Log output as json
+ -m, --managed-from string Link to a github repo where the data structure is managed
+ -o, --org-id string Your organization id
+ -q, --quiet Log output level to Warn
+ -s, --silent Disable output
+```
+
+
+
+## Data-Structures
+
+
+Work with Snowplow data structures
+
+### Examples
+
+```
+ $ snowplow-cli data-structures generate my_new_data_structure
+ $ snowplow-cli ds validate
+ $ snowplow-cli ds publish dev
+```
+
+### Options
+
+```
+ -S, --api-key string BDP console api key
+ -a, --api-key-id string BDP console api key id
+ -h, --help help for data-structures
+ -H, --host string BDP console host (default "https://console.snowplowanalytics.com")
+ -m, --managed-from string Link to a github repo where the data structure is managed
+ -o, --org-id string Your organization id
+```
+
+### Options inherited from parent commands
+
+```
+ --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml
+ Then on:
+ Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml
+ Darwin $HOME/Library/Application Support/snowplow/snowplow.yml
+ Windows %AppData%\snowplow\snowplow.yml
+ --debug Log output level to Debug
+ --json-output Log output as json
+ -q, --quiet Log output level to Warn
+ -s, --silent Disable output
+```
+
+
+
+## Data-Structures Download
+
+
+Download all data structures from BDP Console
+
+### Synopsis
+
+Downloads the latest versions of all data structures from BDP Console.
+
+Will retrieve schema contents from your development environment.
+If no directory is provided, it defaults to 'data-structures' in the current directory.
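+For example, with a vendor of `com.example` (illustrative, matching the `generate` example below) the layout looks like:
+
+```
+data-structures/
+└── com.example/
+    └── login_click.yaml
+```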
+
+```
+snowplow-cli data-structures download {directory ./data-structures} [flags]
+```
+
+### Examples
+
+```
+ $ snowplow-cli ds download
+ $ snowplow-cli ds download --output-format json ./my-data-structures
+```
+
+### Options
+
+```
+ -h, --help help for download
+ -f, --output-format string Format of the files to read/write. json or yaml are supported (default "yaml")
+```
+
+### Options inherited from parent commands
+
+```
+ -S, --api-key string BDP console api key
+ -a, --api-key-id string BDP console api key id
+ --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml
+ Then on:
+ Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml
+ Darwin $HOME/Library/Application Support/snowplow/snowplow.yml
+ Windows %AppData%\snowplow\snowplow.yml
+ --debug Log output level to Debug
+ -H, --host string BDP console host (default "https://console.snowplowanalytics.com")
+ --json-output Log output as json
+ -m, --managed-from string Link to a github repo where the data structure is managed
+ -o, --org-id string Your organization id
+ -q, --quiet Log output level to Warn
+ -s, --silent Disable output
+```
+
+
+
+## Data-Structures Generate
+
+
+Generate a new data structure locally
+
+### Synopsis
+
+Will write a new data structure to file based on the arguments provided.
+
+Example:
+ $ snowplow-cli ds gen login_click --vendor com.example
+ Will result in a new data structure getting written to './data-structures/com.example/login_click.yaml'
+ The directory 'com.example' will be created automatically.
+
+ $ snowplow-cli ds gen login_click
+ Will result in a new data structure getting written to './data-structures/login_click.yaml' with
+ an empty vendor field. Note that vendor is a required field and will cause a validation error if not completed.
+
+```
+snowplow-cli data-structures generate login_click {directory ./data-structures} [flags]
+```
+
+### Examples
+
+```
+ $ snowplow-cli ds generate my-ds
+ $ snowplow-cli ds generate my-ds ./my-data-structures
+```
+
+### Options
+
+```
+ --entity Generate data structure as an entity
+ --event Generate data structure as an event (default true)
+ -h, --help help for generate
+ --output-format string Format for the file (yaml|json) (default "yaml")
+ --vendor string A vendor for the data structure.
+ Must conform to the regex pattern [a-zA-Z0-9-_.]+
+```
+
+### Options inherited from parent commands
+
+```
+ -S, --api-key string BDP console api key
+ -a, --api-key-id string BDP console api key id
+ --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml
+ Then on:
+ Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml
+ Darwin $HOME/Library/Application Support/snowplow/snowplow.yml
+ Windows %AppData%\snowplow\snowplow.yml
+ --debug Log output level to Debug
+ -H, --host string BDP console host (default "https://console.snowplowanalytics.com")
+ --json-output Log output as json
+ -m, --managed-from string Link to a github repo where the data structure is managed
+ -o, --org-id string Your organization id
+ -q, --quiet Log output level to Warn
+ -s, --silent Disable output
+```
+
+
+
+## Data-Structures Publish
+
+
+Publishing commands for data structures
+
+### Synopsis
+
+Publishing commands for data structures
+
+Publish local data structures to BDP console.
+
+
+### Options
+
+```
+ -h, --help help for publish
+```
+
+### Options inherited from parent commands
+
+```
+ -S, --api-key string BDP console api key
+ -a, --api-key-id string BDP console api key id
+ --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml
+ Then on:
+ Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml
+ Darwin $HOME/Library/Application Support/snowplow/snowplow.yml
+ Windows %AppData%\snowplow\snowplow.yml
+ --debug Log output level to Debug
+ -H, --host string BDP console host (default "https://console.snowplowanalytics.com")
+ --json-output Log output as json
+ -m, --managed-from string Link to a github repo where the data structure is managed
+ -o, --org-id string Your organization id
+ -q, --quiet Log output level to Warn
+ -s, --silent Disable output
+```
+
+
+
+## Data-Structures Publish Dev
+
+
+Publish data structures to your development environment
+
+### Synopsis
+
+Publish modified data structures to BDP Console and your development environment
+
+The 'meta' section of a data structure is not versioned within BDP Console.
+Changes to it will be published by this command.
+
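+As an illustration, the `meta` section of a local data structure file looks something like this (a sketch; exact fields can vary by version):
+
+```yaml
+meta:
+  hidden: false
+  schemaType: event
+  customData: {}
+```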
+
+```
+snowplow-cli data-structures publish dev [paths...] default: [./data-structures] [flags]
+```
+
+### Examples
+
+```
+ $ snowplow-cli ds publish dev
+ $ snowplow-cli ds publish dev --dry-run
+ $ snowplow-cli ds publish dev --dry-run ./my-data-structures ./my-other-data-structures
+```
+
+### Options
+
+```
+ -d, --dry-run Only print planned changes without performing them
+ --gh-annotate Output suitable for github workflow annotation (ignores -s)
+ -h, --help help for dev
+```
+
+### Options inherited from parent commands
+
+```
+ -S, --api-key string BDP console api key
+ -a, --api-key-id string BDP console api key id
+ --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml
+ Then on:
+ Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml
+ Darwin $HOME/Library/Application Support/snowplow/snowplow.yml
+ Windows %AppData%\snowplow\snowplow.yml
+ --debug Log output level to Debug
+ -H, --host string BDP console host (default "https://console.snowplowanalytics.com")
+ --json-output Log output as json
+ -m, --managed-from string Link to a github repo where the data structure is managed
+ -o, --org-id string Your organization id
+ -q, --quiet Log output level to Warn
+ -s, --silent Disable output
+```
+
+
+
+## Data-Structures Publish Prod
+
+
+Publish data structures to your production environment
+
+### Synopsis
+
+Publish data structures from your development to your production environment
+
+Data structures found in the provided paths which are deployed to your development
+environment will be published to your production environment.
+
+
+```
+snowplow-cli data-structures publish prod [paths...] default: [./data-structures] [flags]
+```
+
+### Examples
+
+```
+
+ $ snowplow-cli ds publish prod
+ $ snowplow-cli ds publish prod --dry-run
+ $ snowplow-cli ds publish prod --dry-run ./my-data-structures ./my-other-data-structures
+
+```
+
+### Options
+
+```
+ -d, --dry-run Only print planned changes without performing them
+ -h, --help help for prod
+```
+
+### Options inherited from parent commands
+
+```
+ -S, --api-key string BDP console api key
+ -a, --api-key-id string BDP console api key id
+ --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml
+ Then on:
+ Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml
+ Darwin $HOME/Library/Application Support/snowplow/snowplow.yml
+ Windows %AppData%\snowplow\snowplow.yml
+ --debug Log output level to Debug
+ -H, --host string BDP console host (default "https://console.snowplowanalytics.com")
+ --json-output Log output as json
+ -m, --managed-from string Link to a github repo where the data structure is managed
+ -o, --org-id string Your organization id
+ -q, --quiet Log output level to Warn
+ -s, --silent Disable output
+```
+
+
+
+## Data-Structures Validate
+
+
+Validate data structures with BDP Console
+
+### Synopsis
+
+Sends all data structures from the provided paths for validation by BDP Console.
+
+```
+snowplow-cli data-structures validate [paths...] default: [./data-structures] [flags]
+```
+
+### Examples
+
+```
+ $ snowplow-cli ds validate
+ $ snowplow-cli ds validate ./my-data-structures ./my-other-data-structures
+```
+
+### Options
+
+```
+ --gh-annotate Output suitable for github workflow annotation (ignores -s)
+ -h, --help help for validate
+```
+
+### Options inherited from parent commands
+
+```
+ -S, --api-key string BDP console api key
+ -a, --api-key-id string BDP console api key id
+ --config string Config file. Defaults to $HOME/.config/snowplow/snowplow.yml
+ Then on:
+ Unix $XDG_CONFIG_HOME/snowplow/snowplow.yml
+ Darwin $HOME/Library/Application Support/snowplow/snowplow.yml
+ Windows %AppData%\snowplow\snowplow.yml
+ --debug Log output level to Debug
+ -H, --host string BDP console host (default "https://console.snowplowanalytics.com")
+ --json-output Log output as json
+ -m, --managed-from string Link to a github repo where the data structure is managed
+ -o, --org-id string Your organization id
+ -q, --quiet Log output level to Warn
+ -s, --silent Disable output
+```
+
+
+
diff --git a/docs/data-product-studio/snowtype/index.md b/docs/data-product-studio/snowtype/index.md
index bc76f956e7..8bd820315a 100644
--- a/docs/data-product-studio/snowtype/index.md
+++ b/docs/data-product-studio/snowtype/index.md
@@ -1,7 +1,7 @@
---
title: "Code Generation - automatically generate code for Snowplow tracking SDKs"
sidebar_position: 6
-sidebar_label: "Snowtype"
+sidebar_label: "Code generation (Snowtype)"
sidebar_custom_props:
offerings:
- bdp
diff --git a/docs/resources/recipes-tutorials/recipe-data-structures-in-git/index.md b/docs/resources/recipes-tutorials/recipe-data-structures-in-git/index.md
index 732d828966..57415e0a78 100644
--- a/docs/resources/recipes-tutorials/recipe-data-structures-in-git/index.md
+++ b/docs/resources/recipes-tutorials/recipe-data-structures-in-git/index.md
@@ -16,19 +16,19 @@ The Snowplow Console's UI offers excellent facilities to get started quickly wit
A common solution when faced with these requirements is to move management to some form of version control platform (github/gitlab). This opens up an entire ecosystem of tools and patterns enabling all manner of custom workflows.
-We have built [snowplow-cli](/docs/data-product-studio/data-structures/manage/cli/index.md) to help you bridge the gap between these repository based workflows and BDP Console.
+We have built [Snowplow CLI](/docs/data-product-studio/snowplow-cli/index.md) to help you bridge the gap between these repository-based workflows and BDP Console.
## Prerequisites
* A deployed Snowplow BDP pipeline
-* [snowplow-cli](/docs/data-product-studio/data-structures/manage/cli/index.md#download) downloaded and configured
+* [Snowplow CLI](/docs/data-product-studio/snowplow-cli/index.md) installed and configured
* A familiarity with [git](https://git-scm.com/) and an understanding of [github actions](https://docs.github.com/en/actions/writing-workflows)
* A sensible [terminal emulator](https://en.wikipedia.org/wiki/Terminal_emulator) and shell
## What you'll be doing
-This recipe will walk through creating and deploying a data structure from the command line using [snowplow-cli](https://github.com/snowplow-product/snowplow-cli). It will then show how it is possible to automate the validation and deployment process using [github actions](https://docs.github.com/en/actions/writing-workflows).
+This recipe will walk through creating and deploying a data structure from the command line using [Snowplow CLI](/docs/data-product-studio/snowplow-cli/index.md). It will then show how it is possible to automate the validation and deployment process using [github actions](https://docs.github.com/en/actions/writing-workflows).
## Create a local data structure
@@ -360,7 +360,206 @@ Validation has passed. Now our colleagues can feedback on our changes and if eve
Finally, once we are convinced everything works we can open another pull request from `develop` to `main`, merge that and trigger our `publish-production.yml` workflow.
+## Following up with data products
+
+Now that we have our data structures set up, we can define data products to organize and document how these structures are used across our applications. We'll walk through creating source applications, data products, and event specifications using the CLI, then integrate them into our automated workflows.
+
+### Create a source application
+
+First, we'll create a source application to represent our website that will send the `login` event we defined earlier.
+
+```bash
+snowplow-cli dp generate --source-app website
+```
+:::note
+`dp` is an alias for `data-products`. Source applications and event specifications are also managed by this command.
+:::
+
+This should produce the following output:
+```
+INFO generate wrote kind="source app" file=data-products/source-apps/website.yaml
+```
+
+The generated file is written to the default `data-products/source-apps` directory. Help for all the arguments available to `generate` is available by running `snowplow-cli dp generate --help`.
+
+Let's examine the generated file:
+
+```yml title="data-products/source-apps/website.yaml"
+apiVersion: v1
+resourceType: source-application
+resourceName: b8261a25-ee81-4c6a-a94c-7717ba835035
+data:
+ name: website
+ appIds: []
+ entities:
+ tracked: []
+ enriched: []
+```
+
+* `apiVersion` should always be `v1`
+* `resourceType` should remain `source-application`
+* `resourceName` is a unique identifier for the source application. It must be a valid UUID v4
+* `data` is the contents of the source app
+
+Now let's customize our source application. We'll configure it to handle events from our production website as well as staging and UAT environments. We'll also add an owner field and remove the unused entities section.
+
+```yml {6-7} title="data-products/source-apps/website.yaml"
+apiVersion: v1
+resourceType: source-application
+resourceName: b8261a25-ee81-4c6a-a94c-7717ba835035
+data:
+ name: website
+  appIds: ["website", "website-stage", "website-uat"]
+ owner: me@example.com
+```
+
+Before publishing, we can validate our changes and preview what will happen:
+
+```bash
+snowplow-cli dp publish --dry-run
+```
+
+The command will show us the planned changes:
+```
+publish will create source apps file=.../data-products/source-apps/website.yaml name=website resource name=b8261a25-ee81-4c6a-a94c-7717ba835035
+```
+
+When we're happy with the proposed changes, we can publish by removing the `--dry-run` flag:
+
+```bash
+snowplow-cli dp publish
+```
+
+After publishing, you'll be able to see your new source application in the BDP Console UI.
+
+### Create a data product and an event specification
+
+Let's now create a data product and an event specification by running the following command:
+
+```bash
+snowplow-cli dp generate --data-product Login
+```
+This should produce the following output:
+```
+INFO generate wrote kind="data product" file=data-products/login.yaml
+```
+Let's see what it has created for us:
+
+```yml title="data-products/login.yaml"
+apiVersion: v1
+resourceType: data-product
+resourceName: 0edb4b95-3308-40c4-b266-eae2910d5d2a
+data:
+ name: Login
+ sourceApplications: []
+ eventSpecifications: []
+```
+
+Let's amend it to add an event specification and a reference to a source application:
+
+```yml {6,7,9,11-14} title="data-products/login.yaml"
+apiVersion: v1
+resourceType: data-product
+resourceName: 0edb4b95-3308-40c4-b266-eae2910d5d2a
+data:
+ name: Login
+ owner: me@example.com
+ description: Login page
+ sourceApplications:
+ - $ref: ./source-apps/website.yaml
+ eventSpecifications:
+ - resourceName: cfb3a227-0482-4ea9-8b0d-f5a569e5d103
+ name: Login success
+ event:
+ source: iglu:com.example/login/jsonschema/1-0-1
+```
+
+:::note
+You'll need to come up with a valid UUID v4 for the `resourceName` of an event specification. You can do so by using an [online generator](https://www.uuidgenerator.net), or by running the `uuidgen` command in your terminal.
+:::
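+For example (a quick sketch; `uuidgen` prints uppercase on some platforms, so we lowercase it):
+
+```bash
+uuidgen | tr '[:upper:]' '[:lower:]'
+# e.g. 1f9a6d0e-2b7c-4c1e-9f0a-3d4b5e6f7a8b
+```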
+
+:::caution Warning
+
+The `iglu:com.example/login/jsonschema/1-0-1` data structure has to be deployed to at least a development environment. Referencing local data structures is currently not supported.
+
+:::
+
+We can run the same `publish --dry-run` command as before to check that the planned changes are as expected. The output should contain the following lines:
+
+```bash
+snowplow-cli dp publish --dry-run
+```
+
+```
+INFO publish will create data product file=.../data-products/login.yaml name=Login resource name=0edb4b95-3308-40c4-b266-eae2910d5d2a
+INFO publish will update event specifications file=.../data-products/login.yaml name="Login success" resource name=cfb3a227-0482-4ea9-8b0d-f5a569e5d103 in data product=0edb4b95-3308-40c4-b266-eae29
+```
+
+We can apply the changes by running the publish command without the `--dry-run` flag:
+
+```bash
+snowplow-cli dp publish
+```
+
+### Add data product validation and publishing to the GitHub Actions workflows
+
+Now that we've modeled a source application, a data product and an event specification, let's see how we can add them to the existing GitHub Actions workflows for data structures. You can customize this setup, for example using a separate repository or separate actions, but in this example we'll add data product publishing to the existing workflows.
+
+Let's modify the pull request example and add the following line. This command validates the changes and prints them to the GitHub Actions log.
+
+```yml {20} title=".github/workflows/validate-pull-request.yml"
+on:
+ pull_request:
+ branches: [develop, main]
+
+jobs:
+ validate:
+ runs-on: ubuntu-latest
+ env:
+ SNOWPLOW_CONSOLE_ORG_ID: ${{ secrets.SNOWPLOW_CONSOLE_ORG_ID }}
+ SNOWPLOW_CONSOLE_API_KEY_ID: ${{ secrets.SNOWPLOW_CONSOLE_API_KEY_ID }}
+ SNOWPLOW_CONSOLE_API_KEY: ${{ secrets.SNOWPLOW_CONSOLE_API_KEY }}
+
+ steps:
+ - uses: actions/checkout@v4
+
+ - uses: snowplow-product/setup-snowplow-cli@v1
+
+ - run: snowplow-cli ds validate --gh-annotate
+
+ - run: snowplow-cli dp publish --dry-run --gh-annotate
+```
+
+Data products, source applications and event specifications don't have separate dev and prod environments, so it's enough to publish them once.
+We can add the same command, without the `--dry-run` flag, to the publish pipeline.
+
+```yml {20} title=".github/workflows/publish-develop.yml"
+on:
+ push:
+ branches: [develop]
+
+jobs:
+ publish:
+ runs-on: ubuntu-latest
+ env:
+ SNOWPLOW_CONSOLE_ORG_ID: ${{ secrets.SNOWPLOW_CONSOLE_ORG_ID }}
+ SNOWPLOW_CONSOLE_API_KEY_ID: ${{ secrets.SNOWPLOW_CONSOLE_API_KEY_ID }}
+ SNOWPLOW_CONSOLE_API_KEY: ${{ secrets.SNOWPLOW_CONSOLE_API_KEY }}
+
+ steps:
+ - uses: actions/checkout@v4
+
+ - uses: snowplow-product/setup-snowplow-cli@v1
+
+ - run: snowplow-cli ds publish dev --managed-from $GITHUB_REPOSITORY
+
+ - run: snowplow-cli dp publish
+```
+
+You might want to publish data products in `.github/workflows/publish-production.yml` as well, or only there. It depends on your setup, but if you strictly follow the rules and always merge to `main` from `develop`, the setup above should be enough.
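+If you do, that workflow could look like the following sketch, mirroring the develop workflow above (adapt the branch names and steps to your setup):
+
+```yml title=".github/workflows/publish-production.yml"
+on:
+  push:
+    branches: [main]
+
+jobs:
+  publish:
+    runs-on: ubuntu-latest
+    env:
+      SNOWPLOW_CONSOLE_ORG_ID: ${{ secrets.SNOWPLOW_CONSOLE_ORG_ID }}
+      SNOWPLOW_CONSOLE_API_KEY_ID: ${{ secrets.SNOWPLOW_CONSOLE_API_KEY_ID }}
+      SNOWPLOW_CONSOLE_API_KEY: ${{ secrets.SNOWPLOW_CONSOLE_API_KEY }}
+
+    steps:
+      - uses: actions/checkout@v4
+
+      - uses: snowplow-product/setup-snowplow-cli@v1
+
+      - run: snowplow-cli ds publish prod --managed-from $GITHUB_REPOSITORY
+
+      - run: snowplow-cli dp publish
+```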
+
## Let's break down what we've done
+* We have seen how Snowplow CLI can be used to work with data structures from the command line
+* We have applied that knowledge to build GitHub workflows which support automated validation and publication
+* We have added source applications, data products and event specifications using the same approach as data structures