Summary
Tiger Analytics is seeking a remote Telematics Data Scientist with 8 to 12 years of consulting experience and 5 years of experience in telematics. The role involves applying AI, machine learning, and data mining to design advanced analytics engines. A Bachelor's Degree in Computer Science or a closely related field is required.
Requirements
- Bachelor’s Degree in Computer Science or closely related field
- 8 to 12 years of experience in Consulting
- 5 years of experience in Telematics
- Experience with R, Python, or similar programming languages
- Experience working in the areas of vehicle telematics, logistics, pattern detection in sensor or smartphone-connected data, and geospatial mapping
- Experience developing algorithms that reduce computation time through parallel or distributed processing (e.g. MapReduce; a minimal sketch of the map/reduce pattern follows this list)
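To illustrate the kind of computation-time reduction referenced above, here is a minimal map/reduce-style sketch in plain Python, parallelizing a per-vehicle distance aggregation across chunks of records with multiprocessing. The record layout and field names are hypothetical, not part of this role's actual stack.

```python
# Illustrative sketch only: map/reduce-style aggregation of trip distance per
# vehicle, parallelized across record chunks. Record format is hypothetical.
from collections import Counter
from multiprocessing import Pool

def map_chunk(records):
    """Map step: sum distance per vehicle within one chunk of records."""
    totals = Counter()
    for vehicle_id, distance_km in records:
        totals[vehicle_id] += distance_km
    return totals

def reduce_totals(partials):
    """Reduce step: merge the per-chunk partial sums."""
    merged = Counter()
    for part in partials:
        merged.update(part)
    return merged

if __name__ == "__main__":
    # Hypothetical input: (vehicle_id, distance_km) tuples split into chunks.
    chunks = [
        [("veh-1", 12.4), ("veh-2", 3.1)],
        [("veh-1", 8.0), ("veh-3", 22.5)],
    ]
    with Pool(processes=2) as pool:
        partials = pool.map(map_chunk, chunks)
    print(reduce_totals(partials))
```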
Responsibilities
- Apply strong expertise in AI through the use of machine learning, data mining, and information retrieval to design, prototype, and build next generation advanced analytics engines and services
- Collaborate with cross-functional teams and business partners to define the technical problem statement and hypotheses to test
- Develop efficient and accurate analytical models that mimic business decisions, and incorporate those models into analytical data products and tools (a toy model of this kind is sketched after this list)
- Drive current and future strategy by applying your analytical skills, ensuring business value and communicating the results
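As a rough illustration of the modeling work described above, the sketch below trains a toy classifier on synthetic trip-level features. All feature names, the label, and the data are invented for demonstration; this is not the team's actual methodology.

```python
# Illustrative sketch only: a toy model predicting a risk label from
# hypothetical trip-level telematics features, with synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1_000
X = np.column_stack([
    rng.normal(40, 10, n),   # mean speed (km/h), hypothetical feature
    rng.poisson(2, n),       # harsh-braking events per trip, hypothetical
    rng.uniform(0, 1, n),    # share of night driving, hypothetical
])
# Synthetic label loosely tied to the features, for demonstration only.
y = (0.05 * X[:, 0] + 0.8 * X[:, 1] + 2.0 * X[:, 2]
     + rng.normal(0, 1, n)) > 4.5

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```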
Preferred Qualifications
- Deep expertise with relevant geospatial packages (e.g. geopandas and rasterio in Python; maptools, spdep, or OpenStreetMap in R); see the geospatial sketch after this list
- Experience with popular machine learning and deep learning frameworks (e.g. H2O, TensorFlow, PySpark, PyTorch, MXNet, Caffe)
- Understanding of the challenges in data harmonization and feature preparation across a variety of third-party providers
- Experience with distributed storage and database platforms
- Understanding of how telematics data is leveraged downstream for BI (e.g. trip completion) and analytics (e.g. pricing)
- Experience working with weather and atmospheric data
- Experience with batch, micro-batch, streaming, and distributed processing platforms such as Flink, Hadoop, Kafka, Spark, Hudi, AWS EMR, Arrow, or Storm
- Experience working within Amazon Web Services (AWS) cloud computing environments
- Familiarity with containerization tools such as Docker and Kubernetes
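The sketch below shows the kind of geospatial work the qualifications above point to: tagging GPS pings with the geofence polygon they fall inside via a geopandas spatial join. The file path and the zone_name column are hypothetical, used only to make the example self-contained.

```python
# Illustrative sketch only: attach geofence attributes to telematics GPS pings
# with a point-in-polygon spatial join. Paths and column names are hypothetical.
import geopandas as gpd
import pandas as pd

# Hypothetical ping data: one row per GPS fix.
pings = pd.DataFrame({
    "trip_id": ["t1", "t1", "t2"],
    "lon": [-87.63, -87.62, -87.90],
    "lat": [41.88, 41.89, 41.98],
})
pings_gdf = gpd.GeoDataFrame(
    pings,
    geometry=gpd.points_from_xy(pings.lon, pings.lat),
    crs="EPSG:4326",
)

# Hypothetical geofence layer (e.g. depot or delivery zones).
zones = gpd.read_file("zones.geojson").to_crs("EPSG:4326")

# Spatial join: keep every ping, adding zone attributes where it falls inside a zone.
tagged = gpd.sjoin(pings_gdf, zones, how="left", predicate="within")
print(tagged[["trip_id", "zone_name"]])
```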
Benefits
This position offers an excellent opportunity for significant career development in a fast-growing, challenging entrepreneurial environment with a high degree of individual responsibility.