Job description
Tiger Analytics is pioneering what AI and analytics can do to solve some of the toughest problems faced by organizations globally. We develop bespoke solutions powered by data and technology for several Fortune 100 companies. We have offices in multiple cities across the US, UK, India, and Singapore, and a substantial remote global workforce.
If you are passionate about working on business problems that can be solved using structured and unstructured data at scale, Tiger Analytics would like to talk to you. We are seeking an experienced and dynamic Solution Architect in Data Engineering to play a key role in designing and implementing robust data solutions that help solve our clients' complex business problems.
Responsibilities
Manage large-scale data analytics client engagements in a global delivery model
Work with business executives to understand the business requirements and constraints that shape the success of the program
Present solutions to technology and business audiences, highlighting the robustness of the solution and how it can help generate business value
Ideate solutions to business problems, leveraging modern trends and patterns in data and analytics
Hold discussions on high-level architecture, platforms, and tool fit for data and analytics solutions
Steer discussions on strategizing and implementing digital and cloud solutions
Collaborate with business/IT stakeholders and product managers to ideate software solutions
Requirements
12+ years of experience in Big Data, data warehousing, data analytics, and/or other Information Management projects
Prior experience building large-scale enterprise data architectures using commercial and/or open-source Data Analytics technologies
Experience designing and developing scalable Databricks-based architectures to process and analyze large-scale life sciences data
Experience leading end-to-end data platform architecture, including ETL/ELT pipelines, data lakes, data warehouses, and data governance frameworks
Hands-on experience implementing Databricks on AWS, ensuring security, scalability, and performance
Proficiency with Delta Lake, Unity Catalog, MLflow, and Databricks SQL for advanced analytics
Experience integrating Snowflake, Redshift, Synapse, and other cloud-native data platforms where required
Ability to produce client-ready solution architectures and business-friendly presentations, along with the communication skills to lead and run workshops
Strong knowledge of data processing technologies such as Spark, Scala, Impala, Hive SQL, Apache NiFi, and Kafka, needed to build and maintain complex queries as well as streaming and real-time data pipelines
Data modeling and architecting skills, including a strong foundation in data warehousing concepts and experience with data warehouse solutions such as Amazon Redshift, Google BigQuery, or Snowflake
Deep understanding of the Life Sciences domain, to deliver domain-specific solutions that address industry challenges
Good knowledge of DevOps engineering using Continuous Integration/Delivery (CI/CD) tools
In-depth experience with cloud solutions and with integrating them into traditional hosting/delivery models
Strong project management and team management skills, with the ability to work with global teams
Understanding of how to apply project management and agile methodologies
Ability to work in a large matrixed organization and steer the path to success
Excellent communication, presentation, and interpersonal skills
Significant career development opportunities exist as the company grows. The position offers a unique opportunity to be part of a small, fast-growing, challenging, and entrepreneurial environment, with a high degree of individual responsibility.