| Repository | Description | Commits | Stars | Forks |
|---|---|---:|---:|---:|
| nuker | Cleans up AWS resources based on configurable rules | 139 | 3 | 0 |
| ankus | Deployment & orchestration tool for big data frameworks | 670 | 21 | 9 |
| ankus-modules | Modules used by ankus to manage big-data frameworks | 191 | 5 | 3 |
| scripts | Cloudwick deployment scripts | 43 | 2 | 0 |
| LogEventsProcessing | Real-time log event processing using Storm, Kafka, Logstash & Cassandra | 11 | 47 | 25 |
| generator | Synthetic data generators for simulating real-time data and workloads | 42 | 10 | 4 |
| awscli | Amazon Web Services command-line interface in Ruby | 101 | 10 | 4 |
| time-tracker | Play application for managing timesheets, invoices, expenses and more | 41 | 0 | 0 |
| benchmark | Application to benchmark inserts, reads and queries of NoSQL data stores | 48 | 12 | 4 |
| storm-helloworld | Sample hello-world topology with a POM | 4 | 13 | 25 |
| LogEventsProcessingSpark | Real-time log event processing using Spark, Kafka & Cassandra | 7 | 13 | 18 |
| dotfiles | Dotfiles | 61 | 3 | 0 |
| scripts | Day-to-day DevOps scripts | 55 | 1 | 5 |
| bulkload_mongo_mapreduce | Simple MapReduce program to bulk-load HDFS data into MongoDB | 50 | 0 | 0 |
| spark_codebase | Collection of Spark core, streaming, SQL and MLlib examples & applications with baseline unit tests | 22 | 6 | 4 |
| spark-starter | Sample Spark starter application illustrating word count, with a test suite | 3 | 1 | 3 |
| dscb | Distributed Systems Code Base (for training purposes) | 16 | 3 | 1 |
| puppet_kerberos | Puppet module to install Kerberos | 11 | 1 | 2 |
| index_tweets_solr | Indexes Twitter tweets using Solr | 2 | 2 | 0 |
| kafka_code_base | Simple Kafka producer and consumer examples | 3 | 1 | 1 |
| mapreduce_training | Set of MapReduce applications used for teaching purposes | 8 | 3 | 5 |
| log_analytics_mapreduce | Analytics on HTTP web-server log events using MapReduce | 12 | 5 | 6 |
| ncdc_data_processing | Process and analyze the NCDC weather dataset using Hadoop MapReduce | 1 | 1 | 5 |
| cm_automation | Cloudera Manager automation using Puppet, Chef & the CM API | 11 | 0 | 2 |
| clickstream_generator | Synthetic data generator for clickstream data | 1 | 1 | 1 |
| puppet_java | Puppet module to install and manage Java | 10 | 1 | 1 |
| http_events_gen | Mocks HTTP web requests (Apache web-server format) | 4 | 1 | 2 |
| puppet_module_scm | Puppet module to deploy and manage Cloudera Manager | 2 | 1 | 2 |
| s3-restore | Restores deleted objects from an S3 version-enabled bucket | 11 | 0 | 0 |
| udmp-wpa-supplicant-monitor | Monitors the wpa-supplicant container | 2 | 0 | 0 |
| ashrithr.github.io | Blog | 9 | 0 | 1 |
| play-cloudwickone-slick-template | Play Slick (Postgres) template | 1 | 1 | 0 |
| centos-base | Docker CentOS base image with SSH and Supervisor configured | 4 | 0 | 1 |
| mapreduce_joins | Simple examples illustrating joins in MapReduce | 6 | 0 | 1 |
| game_data_gen | Mocks data generated by a gaming website | 6 | 0 | 2 |
| puppet_logstash | Puppet module to install & manage Logstash and Lumberjack | 9 | 0 | 1 |
| movie_data_gen | Set of programs to generate random movie datasets | 8 | 0 | 1 |
| analytics-game-demo | Analytics demo on game data | 6 | 0 | 1 |
| puppet_zookeeper | Puppet module to install and manage Apache ZooKeeper | 6 | 0 | 1 |
| cm_api | Illustrates Cloudera Manager API use with Ruby (HTTParty) | 1 | 0 | 1 |
| index_logs | Scala application to interface with Solr for indexing and querying Apache HTTP log events | 1 | 0 | 1 |
| sinatra_delayed_job_active_record | Simple project illustrating how to use delayed_job with ActiveRecord and Sinatra | 3 | 0 | 0 |
| play-cloudwickone-mongo-template | Play MongoDB template | 5 | 0 | 0 |
| cloudwick_one_template | Bootstrap template for Cloudwick One | 1 | 0 | 0 |
| docker-hadoop-worker | Docker Hadoop worker image | 2 | 0 | 0 |
| docker-hadoop-master | Docker Hadoop master image | 2 | 0 | 0 |
| puppet_module_mongo | Puppet module to manage MongoDB | 13 | 0 | 0 |
| sinatra_delayedjob_mongoid | Example usage of Sinatra with delayed_job and Mongoid | 3 | 0 | 0 |
| deb-pkgs | Builds Debian packages for some big data projects | 13 | 0 | 0 |
| puppet_kafka | Puppet module to install Kafka 0.8 | 6 | 0 | 0 |
| puppet_storm | Puppet module to deploy Storm | 4 | 0 | 0 |
| blog-engine | A simple blog engine using Mongo & Sinatra | 3 | 0 | 0 |
| puppet_jmxtrans | Puppet module to install & manage jmxtrans | 3 | 0 | 0 |
| puppet_ganglia | Puppet module to install Ganglia | 3 | 0 | 0 |
| puppet_module_base | Puppet module to manage users and install zsh (oh-my-zsh) for each user | 2 | 0 | 0 |
| puppet_scala | Puppet module to install Scala | 1 | 0 | 0 |
| tana-readwise-exporter | Exports Readwise highlights to Tana | 1 | 0 | 0 |
| tana-readwise-exporter | CLI to export Readwise highlights to Tana.io | 1 | 0 | 0 |
| chef-repo | Base repository for Chef code | 16 | 0 | 0 |
| navi | An interactive cheatsheet tool for the command line | 1 | 10183 | 408 |
| slack-scala-client | A Scala library for interacting with the Slack API and real-time messaging interface | 1 | 180 | 110 |
| awesome-bigdata | A curated list of awesome big data frameworks, resources and other awesomeness | 2 | 10553 | 2383 |
| docker-hadoop-base | Hadoop base image for Docker, built on CentOS 7 | 1 | 0 | 0 |
| hadoop | Set of Hadoop-related use cases | 16 | 0 | 1 |
| ankus | DEPRECATED; project maintained at https://github.com/cloudwicklabs/ankus | 371 | 0 | 1 |
| realtime_processing | Set of use cases for big data real-time processing | 6 | 1 | 3 |
| datagenerators | DEPRECATED; use https://github.com/cloudwicklabs/generator | 11 | 0 | 2 |
| rpm-specs | RPM spec files for various big data projects | 14 | 2 | 1 |
| flume_filtering | Filters HTTP log events by status code using Interceptors and ChannelSelectors | 2 | 0 | 6 |
- hadoop copy a local file system folder to HDFS
- hadoop fs -ls results in "no such file or directory"
- How can I access S3/S3n from a local Hadoop 2.6 installation?
- namespace image and edit log
- Can Apache Sqoop and Flume be used interchangeably?
- What is the maximum container(s) in a single-node cluster (hadoop)?
- hadoop fs -text file returns "text: Unable to write to output stream."
- How to upload file to HDFS in Ubuntu
- Data lost after shutting down hadoop HDFS?
- Getting output of system() calls in Ruby
- Host and port to use to list a directory in hdfs
- Different ways to import files into HDFS
- relation between number of input splits and number of mappers in mapreduce hadoop
- Standard practices for logging in MapReduce jobs
- Accessing a file that is being written
- Opening a file stored in HDFS to edit in VI
- mapreduce in java - gzip input files
- MapReduce: How to get mapper to process multiple lines?
- how to start and check job history on hadoop 2.5.2
- MapReduce or Spark for Batch processing on Hadoop?
- HBase: How does data get written in a sorted manner into HFile?
- Complete list of property that is used in Hadoop framework
- Where is the classpath set for hadoop
- Hadoop: Getting the input file name in the mapper only once
- MRv2 / YARN Features
- .sparkstaging directory in hdfs is not deleted
- Hadoop own data types
- How should I persist my event stream to cold storage?
- How to get data from HDFS? Hive?
- What is the -file argument for AWS EMR
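One of the questions above, "Getting output of system() calls in Ruby", has a compact standard-library answer. A minimal sketch (not taken from the linked answer itself): `system` only returns true/false, so to capture output you use backticks or `Open3`.

```ruby
require 'open3'

# Kernel#system returns true/false and sends output to the terminal;
# it does NOT capture stdout.
ok = system('echo', 'ignored')   # => true; "ignored" goes to the terminal

# Backticks capture stdout as a string; $? holds the exit status afterwards.
listing = `echo hello`
puts listing            # prints "hello"
puts $?.success?        # prints "true"

# Open3.capture3 captures stdout, stderr and the status separately,
# and avoids shell interpolation by taking the command as arguments.
stdout, stderr, status = Open3.capture3('echo', 'world')
puts stdout             # prints "world"
puts status.success?    # prints "true"
```

`Open3.capture3` is generally preferred when you also need stderr or want to pass untrusted strings as arguments without invoking a shell.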