Who we are

DoubleVerify is an Israeli-founded big data analytics company (NYSE: DV). We track and analyze tens of billions of ads every day for the biggest brands in the world.
We operate at massive scale: we handle over 100B events per day and over 1M RPS at peak, process events in real time at millisecond latencies, and analyze over 2.5M video years every day. We verify that ads are fraud-free, appear next to appropriate content, and reach people in the right geography, and we measure viewability and user engagement throughout the ad's lifecycle.

We are global, with HQ in NYC and R&D centers in Tel Aviv, New York, Finland, Berlin, Belgium, and San Diego. We work in a fast-paced environment and have plenty of challenges to solve. If you like working at huge scale and want to help us build products with a major impact on the industry and the web, then your place is with us.


What will you do

You will join a team of experienced engineers and help them develop our innovative measurement products.

You will lead projects by architecting, designing, and implementing solutions that impact the core components of our system.

You'll develop new features on a cloud-native technology stack, continuously improve our development process by adopting new technologies, and use them to solve product and engineering challenges while raising the bar for code quality and standards.


Who you are

  • 5+ years of experience coding in an industry-standard language such as Scala, Java, Python, Go etc.
  • Deep understanding of Computer Science fundamentals: object-oriented design, functional programming, data structures, multi-threading and distributed systems.
  • Experience with in-memory distributed caches such as Aerospike or Redis, and messaging systems such as Apache Kafka.
  • Experience working with Docker, Kubernetes and designing scalable microservices architecture.
  • Experience working with SQL databases (MySQL, PostgreSQL) and columnar/NoSQL databases such as BigQuery, Vertica, Snowflake, Couchbase, or Cassandra.
  • Experience working in a big data environment and building scalable distributed systems with stream processing technologies such as Akka Streams, Kafka Streams, Spark, or Flink.
  • Experience working with cloud providers such as GCP or AWS.
  • BSc in Computer Science or equivalent experience.
  • Experience with Agile development, CI/CD pipelines (Git, GitLab, or Jenkins), and coding for automated testing.
  • A versatile developer with a “getting-things-done” attitude.


Nice to have

  • Previous experience with online advertising technologies.
  • Familiarity with the Cloud Native Computing Foundation (CNCF) ecosystem.

Apply for this Job
