Day-to-Day Responsibilities:
Using big data tools (Hadoop, Spark, AWS) to analyze petabytes of AWS usage data and other datasets.
Driving the engineering of security and fine-grained access controls in our emerging data platforms, such as AWS EC2 and HDFS.
Writing software to clean and investigate large numerical datasets.
Integrating with external data sources and APIs to discover interesting trends.
Designing rich data visualizations, using Tableau or other tools, to communicate complex ideas to customers and company leaders.
Investigating the impact of new technologies on the future of digital banking and the financial world of tomorrow.
Working directly with Product Owners and end users to develop solutions in a highly collaborative, agile environment.
The Ideal Candidate Will Be:
Hands-on. You know how to programmatically extract data from a database or an API, run it through a transformation or two, and model it into human-readable form (an ROC curve, a map, a d3 visualization, a Tableau dashboard).
Creative. Big, undefined problems and petabytes of data don't frighten you. You're used to working with abstract data, and you love discovering new narratives in unmined territories.
An owner. You can take an application end to end and take responsibility for it.
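The extract-transform-present workflow described above can be sketched in standard-library Python. This is a minimal illustration, not part of the role's actual stack: the table name, columns, and values are hypothetical, and a plain-text report stands in for a Tableau or d3 visualization.

```python
import sqlite3

# Hypothetical data: an in-memory database stands in for a real data source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE usage (service TEXT, gb REAL)")
conn.executemany(
    "INSERT INTO usage VALUES (?, ?)",
    [("ec2", 1.5), ("s3", 3.0), ("ec2", 2.5)],
)

# Extract + transform: aggregate usage per service.
rows = conn.execute(
    "SELECT service, SUM(gb) FROM usage GROUP BY service ORDER BY service"
).fetchall()

# Present in human-readable form (a plain-text table here; Tableau or d3
# would play this role in practice).
report = "\n".join(f"{service:<8}{total:>6.1f} GB" for service, total in rows)
print(report)
```

The same extract/transform/present split applies whether the source is a SQL warehouse or an external API and whether the output is a text table or an interactive dashboard.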
At least 4 years of experience with Python, Java, or Scala
At least 2 years of experience in open source programming languages
At least 1 year of experience working with AWS
At least 3 years of experience with SQL
At least 1 year of experience with Spark or Hadoop
At least 1 year of experience with Kafka, Tableau or Databricks