1. **Reading and Structuring Metadata:** Reading CSV files and creating organized metadata maps that stay associated with your data files.
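That reading step can be sketched with `splitCsv` (a minimal sketch; the samplesheet name and the `id`, `lang`, and `fastq` columns are illustrative assumptions, not from the original):

    ```groovy
    // Build a [meta, file] tuple for each samplesheet row.
    // 'samplesheet.csv' and its columns (id, lang, fastq) are assumed for illustration.
    channel
        .fromPath('samplesheet.csv')
        .splitCsv(header: true)
        .map { row -> [ [id: row.id, lang: row.lang], file(row.fastq) ] }
    ```

Keeping the metadata in a map as the first tuple element is what lets the later operators key on it.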
2. **Expanding Metadata During Workflow:** Adding new information to your metadata as your pipeline progresses, by capturing process outputs and deriving values through conditional logic.

    - Adding new keys based on process output

        ```groovy
        .map { meta, file, lang ->
            [ meta + [lang: lang], file ]
        }
        ```

    - Adding new keys using a conditional clause

        ```groovy
        .map { meta, file ->
            def lang_group
            if ( meta.lang.equals("de") || meta.lang.equals("en") ) {
                lang_group = "germanic"
            } else if ( meta.lang in ["fr", "es", "it"] ) {
                lang_group = "romance"
            } else {
                lang_group = "unknown"
            }
            // emit the tuple with the derived group stored in the meta map
            [ meta + [lang_group: lang_group], file ]
        }
        ```

3. **Customizing Process Behavior:** Using metadata to adapt how processes handle different files.

    - Using meta values in Process Directives
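For example, meta values can parameterize directives such as `tag` and `publishDir` (a sketch; the process name, file names, and publish path are assumptions, not from the original):

    ```groovy
    process TRANSLATE {
        // Label each task with the sample id and route outputs by language.
        // The process name and paths here are illustrative.
        tag "${meta.id}"
        publishDir "results/${meta.lang}", mode: 'copy'

        input:
        tuple val(meta), path(infile)

        output:
        tuple val(meta), path("${meta.id}.txt")

        script:
        """
        cp ${infile} ${meta.id}.txt
        """
    }
    ```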
    }
    ```
2. **Splitting data into separate channels:** We used `filter` to divide data into independent streams based on the `type` field.

    ```groovy
    // Filter channel based on condition
    channel.filter { it.type == 'tumor' }
    ```
3. **Joining matched samples:** We used `join` to recombine related samples based on `id` and `repeat` fields, joining the two channels by key (the first element of each tuple).
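A minimal sketch of such a join, assuming both channels are first re-keyed on the `id` and `repeat` fields (the channel names and the exact key shape are assumptions, not from the original):

    ```groovy
    // Key both channels on the [id, repeat] sub-map, then join on that key
    // (join matches on the first element of each tuple by default).
    // ch_tumor and ch_normal are illustrative channel names.
    ch_tumor
        .map { meta, file -> [ meta.subMap(['id', 'repeat']), meta, file ] }
        .join(
            ch_normal.map { meta, file -> [ meta.subMap(['id', 'repeat']), meta, file ] }
        )
    ```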
4. **Distributing across intervals:** We used `combine` to create Cartesian products of samples with genomic intervals for parallel processing.

    ```groovy
    // Pair every sample with every interval
    samples_ch.combine(intervals_ch)
    ```
5. **Aggregating by grouping keys:** We used `groupTuple` to group by the first element in each tuple, thereby collecting samples sharing `id` and `interval` fields and merging technical replicates.

    ```groovy
    // Group tuples that share the same first element (the key)
    channel.groupTuple()
    ```
6. **Optimizing the data structure:** We used `subMap` to extract specific fields and created a named closure to make transformations reusable.
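That pattern might look like the following sketch (the closure name and the field list are assumptions, not from the original):

    ```groovy
    // Named closure: trim each meta map down to the fields downstream steps need.
    // 'slimMeta' and the ['id', 'interval'] field list are illustrative.
    def slimMeta = { meta, file -> [ meta.subMap(['id', 'interval']), file ] }

    samples_ch.map(slimMeta)
    ```

Because the closure is named, the same transformation can be applied to several channels without repeating the logic.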