Data Engineers here design, develop and maintain systems for data analysis, transformation, modelling and visualisation. We work directly with data scientists to develop cutting-edge uses of the data we collect.
As a Senior Engineer, you’ll be the bridge between software engineering and data science, helping bring new theories and inventions to life and driving data innovation within Tenable.
Apply the latest real-time streaming technologies to process billions of transactions per day
Work in a lean environment building out the Tenable data platform, combining traditional data warehousing methodologies with the latest trends in big data
Grow your abilities within the data ecosystem, learn from and teach other engineers, and drive a data-driven culture
Explore and work with cutting-edge data processing technologies
Work in a lean environment building out a state-of-the-art data science platform
Develop and automate innovative algorithms for data analytics, machine learning and artificial intelligence
We’re an AWS house, so you’ll have the chance to work with the latest and greatest tools from AWS
Make an impact on the future of Data Processing/Science and Cyber Security
Design and develop solutions that are core to the company values and future
What You’ll Need:
5+ years’ software engineering experience with Java, Python or Scala
Strong experience designing and engineering data processing workflows, ideally using Apache Airflow
Strong experience with Big Data technologies such as Spark, EMR, Hadoop, Flink, Beam, Kafka, Hive, Presto, Impala, Atlas
Deep understanding of data storage technologies for structured and unstructured data
Experience with Docker-based containerisation of software
Experience with a cloud-based architecture such as AWS, GCP or Azure
Experience using Linux as a primary development environment
Strong drive for innovation matched with excellent communication and analytical skills
BSc or MSc in Computer Science, Data Science or directly related field
Expert in Airflow, Spark, EMR, Kinesis, Kafka
Expert in data warehouse modelling (Kimball/Inmon) and data lake design
Strong experience with Docker and Kubernetes
Strong experience with CI/CD technologies
Strong AWS DevOps skills, including monitoring, deployment and infrastructure setup
Experience working in Agile environments (Scrum/Kanban)
Ability to switch between the Python ecosystem and JVM world seamlessly
Experience in a cyber threat or vulnerability related area