FETCH REWARDS INC Big Data Engineer in MADISON, Wisconsin

JOB REQUIREMENTS: Fetch Rewards, Inc. seeks a Big Data Engineer based in Madison, Wisconsin. The job duties for the Big Data Engineer position include:

- Model and tune real-time databases (Redis, Pinot, Hudi, Iceberg, RDS, etc.) and query performance using best practices.
- Enable all stakeholders to access and use data from an ever-growing variety of data sources using tools like Kafka, Trino, Pinot, Grafana, custom solutions, and easy-to-understand documentation.
- Lead data documentation and data discovery initiatives using Lucidchart, Miro, and DataHub.
- Deploy scalable Kubernetes infrastructure to host applications using Docker, Kubernetes, Helm, Argo, AWS EKS, ECR, etc.
- Build performant and scalable pipelines to quickly move data from the backend to the data lake using Kafka, Flink, S3, Hudi, Iceberg, Argo, Airflow, Spark, and Terraform.
- Monitor and maintain alerting and telemetry around pipeline tooling to ensure data gaps are addressed as quickly as possible, using tools like Grafana, PagerDuty, Prometheus, and custom scripts.
- Write applications to quickly encrypt and obfuscate data in real time using Kafka Connect, AWS KMS, and custom scripts (a minimal consume-encrypt-publish sketch follows the posting).
- Design data transformation patterns to improve the transparency and usability of data using Flink SQL, Trino, and Pinot (a minimal Flink SQL sketch follows the posting).
- Stand up highly available and performant real-time OLAP datastores to ingest streams using tools like Pinot, Hudi, and Iceberg.
- Establish technical standards and norms for real-time data ingestion, aggregation, monitoring, and data contracts.
- Work closely with other technical leads to develop and staff cross-guild initiatives with teams from Fetch's other specialized technical departments.

To apply: send resume to resumes-hr5@fetchrewards.com. Please reference the job title when applying.

***** OTHER EXPERIENCE AND QUALIFICATIONS: Minimum requirements: Master's degree in Computer Science, Business Intelligence & Analytics, Data Science, or a related field, plus 2 years of experience handling, modeling, and moving data. In addition, the following specific skills are required:

- 2 years of experience reading, writing, debugging, and deploying code in Python and Java.
- 2 years of experience moving high-volume, high-throughput data (terabytes/day at 1,000 events/s) with tools like Kafka or Kinesis.
- 2 years of experience translating business requirements into technical criteria and scope.
- 1 year of experience building and interacting with real-time OLAP data stores and their associated infrastructure, such as HDFS, Hudi, InfluxDB, and Pinot.

***** APPLICATION INSTRUCTIONS: E-mail a résumé to: resumes-hr5@fetchrewards.com
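For context on the real-time encryption duty above, here is a minimal sketch of the "custom scripts" path: consume records from a Kafka topic, encrypt a sensitive field with AWS KMS via boto3, and publish the result downstream. The topic names, key alias, and "email" field are illustrative assumptions, not details from the posting.

```python
# Minimal sketch: consume -> encrypt a sensitive field with AWS KMS -> re-publish.
# Assumes the kafka-python and boto3 packages, a reachable broker, and an existing
# KMS key; topic names, the key alias, and the "email" field are illustrative only.
import base64
import json

import boto3
from kafka import KafkaConsumer, KafkaProducer

KMS_KEY_ID = "alias/pipeline-pii"        # hypothetical key alias
SOURCE_TOPIC = "user-events-raw"         # hypothetical topics
SINK_TOPIC = "user-events-encrypted"

kms = boto3.client("kms")
consumer = KafkaConsumer(
    SOURCE_TOPIC,
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for record in consumer:
    event = record.value
    if "email" in event:
        # Encrypt the PII field; KMS returns ciphertext bytes, base64-encoded here
        # so the value stays JSON-serializable for the downstream topic.
        ciphertext = kms.encrypt(
            KeyId=KMS_KEY_ID,
            Plaintext=event["email"].encode("utf-8"),
        )["CiphertextBlob"]
        event["email"] = base64.b64encode(ciphertext).decode("ascii")
    producer.send(SINK_TOPIC, event)
```

In practice this logic often lives in a Kafka Connect transform or a Flink job rather than a standalone loop; the script form just keeps the encrypt-in-flight idea visible.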
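The "data transformation patterns using Flink SQL" duty is the kind of work a streaming SQL job covers. Below is a minimal PyFlink sketch, assuming a hypothetical receipt-events topic and schema and a locally available Flink Kafka SQL connector; it rolls events into per-minute, per-user spend aggregates and prints them, which is where a Pinot, Hudi, or Iceberg sink would normally go.

```python
# Minimal PyFlink sketch: read a (hypothetical) Kafka topic of receipt events and
# aggregate per-user spend into one-minute tumbling windows. Assumes pyflink is
# installed and the Flink Kafka SQL connector jar is on the classpath.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Source: JSON events from Kafka with an event-time watermark.
t_env.execute_sql("""
    CREATE TABLE receipt_events (
        user_id STRING,
        amount DOUBLE,
        event_time TIMESTAMP(3),
        WATERMARK FOR event_time AS event_time - INTERVAL '5' SECOND
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'receipt-events',
        'properties.bootstrap.servers' = 'localhost:9092',
        'scan.startup.mode' = 'latest-offset',
        'format' = 'json'
    )
""")

# Sink: print to stdout for the sketch; a real job would target an OLAP store.
t_env.execute_sql("""
    CREATE TABLE spend_per_minute (
        user_id STRING,
        window_end TIMESTAMP(3),
        total_spend DOUBLE
    ) WITH ('connector' = 'print')
""")

# One-minute tumbling-window aggregation expressed in Flink SQL.
t_env.execute_sql("""
    INSERT INTO spend_per_minute
    SELECT
        user_id,
        TUMBLE_END(event_time, INTERVAL '1' MINUTE) AS window_end,
        SUM(amount) AS total_spend
    FROM receipt_events
    GROUP BY user_id, TUMBLE(event_time, INTERVAL '1' MINUTE)
""").wait()
```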
