Design, implement, and maintain a data lake based on AWS technologies. This includes storing, cleansing, preparing, and securing data for use by various internal customers.

Primary Responsibilities:
- Implement and maintain an "Infrastructure as Code" approach to deploying AWS resources in the lake, using technologies such as Git, CloudFormation, Terraform, the AWS CLI, and the AWS CDK.
- Develop workflows based on AWS DMS (Database Migration Service) that extract data from various sources, including Oracle, and populate the lake.
- Develop workflows that transform data from one format to another using Glue and PySpark jobs. Experience with AWS EMR is equivalent.
- Recommend and implement approaches to secure the data lake in general, and implement a role-based approach at the user level so that only authorized users can access sensitive data.
- Develop approaches to monitor the data lake using technologies such as Amazon CloudWatch.
- Develop Lambda and Fargate programs to perform maintenance tasks on the lake.
- Develop and maintain an AWS Aurora PostgreSQL environment. This includes setting up PostgreSQL and developing SQL.
- Evaluate the best type of columnar database to implement, including AWS Redshift and Snowflake.
- Develop Athena tables in both CSV and Parquet formats.
- Assist internal users with developing queries that drive Tableau and other third-party tools to access the data lake via Athena and PostgreSQL.
- Participate in recruiting, hiring, onboarding, and performance management of new team members.
- Participate in special projects and perform other duties as assigned.

Job Requirements:
- Knowledge of and hands-on experience with AWS data services, including several of the following: S3, CloudFormation, Terraform, the AWS CLI, Redshift, EMR, Glue, Snowflake, PostgreSQL, Athena, Fargate, RDS, and AWS Aurora.
- Experience working in a large-scale data warehouse environment, with thorough knowledge of database and data technologies in general.
- Excellent interpersonal skills, including the ability to work effectively with people at all levels.
- Excellent troubleshooting skills under deadline pressure in a production environment.
- Knowledge of and ability to use database-modeling software such as Erwin.
- Strong record of project execution and completion, with experience using Scrum and agile development practices.
- Excellent written and verbal communication skills, including the ability to interact with employees and customers at all levels of the organization.

Education and Experience:
- Bachelor's degree in Computer Science or a related discipline, or equivalent experience. Master's degree preferred.
- A minimum of 7 years' experience as a software engineer or architect on large data projects such as data warehousing.
- A minimum of 4 years' experience developing applications in the AWS environment.
- Experience using a variety of languages and technologies to develop web solutions.
- Experience working in an iterative or agile development environment, preferably Scrum.
Bloomberg BNA is a subsidiary of Bloomberg L.P. and a source of legal, tax, regulatory, and business information for professionals.
Open culture; employee assistance program; employee charitable contribution match program.
Technical phone interview; onsite day of technical interviews with team and senior managers. Average interview period: 2-3 weeks.