Jobs at Factual

Hard problems. Diverse technology. Amazing culture.

Data Scientist

Shanghai

At Factual we love all things data! Our mission is to organize and optimize the world’s location information. Factual’s dataset covers more than 100 million places across 477 categories in 52 countries.

Our Data Team works on cleaning, structuring, and delivering our global places dataset. As a Data Scientist, you will have the opportunity to shape and influence the direction of our products and propel the growth of our business. You will collaborate with a diverse team of software engineers and data engineers to optimize our current machine learning models and pipelines, while also developing creative, data-based solutions. You will be working on many different projects ranging from using ML to score the popularity of a place at any given time to mapping anonymized user data to real world places.

About you:

You are passionate about using your programming, data wrangling, and analytical expertise to drive product and business decisions. You are a skilled communicator who will provide thought leadership on metrics, features, and products to teams across the company. You have strong engineering experience and enjoy working in a rapidly changing environment.

What you’ll do:

  • Build models to solve a wide range of problems, from predicting whether a POI is closed to understanding anonymized user behavior in the real world
  • Design experiments and work with fellow engineers to ensure thoroughness and correctness across a variety of analyses
  • Use, and commit code to, our data processing software
  • Author specifications and lead technical projects
  • Propose creative strategies based on data-driven insights
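To give a flavor of the first bullet, here is a minimal sketch of how POI-closure prediction might be framed as a binary classification problem in scikit-learn (part of the stack listed below). The features and labels here are entirely synthetic and hypothetical; a real pipeline would use signals such as observation recency or review trends.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for per-place features (e.g. days since last
# observation, review-count trend, category signals).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))
# Synthetic "closed" label derived from the first two features.
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Evaluate ranking quality of the closure predictions.
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"AUC: {auc:.2f}")
```

This is only an illustration of the problem framing; the actual models and features used at Factual are not described in this posting.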

What we’re looking for:

  • 2+ years of tech industry experience working on large-scale data
  • Experience maintaining production machine learning pipelines and using data science to solve business problems
  • A Bachelor's degree in a quantitative field (Math, Statistics, Computer Science); advanced degrees are a plus
  • Strong coding experience (we use Python/scikit, Scala/Spark, Java/Hadoop)
  • A habit of writing clean code and producing robust, portable workflows
  • Willingness to do dirty data processing and normalize data into a useful format
  • Excellent oral and written communication skills