Analytics India Jobs offers a new platform for the data science community in India by bringing to the fore the latest job openings across the country.
Here are the five most recent data science job openings across major cities in India:
Business Analyst @ Tredence, Bangalore
Tredence is the first analytics services company focused on the last mile of analytics adoption.
- At least 2 years of work experience
- Proficiency in SQL/Hive [must have]
- Handling and analysing large datasets
- Data cleaning and processing
- R (basic data manipulation, dplyr, ggplot2, random forests)
- Python (basic data manipulation using pandas, matplotlib, sklearn)
- Hadoop, Spark
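Several of the skills above (data cleaning and basic manipulation with pandas) can be illustrated with a minimal sketch; the column names and values here are hypothetical, not taken from the job posting:

```python
import pandas as pd

# Hypothetical sales records with common data-quality issues
df = pd.DataFrame({
    "city": ["Bangalore", "bangalore", None, "Pune"],
    "revenue": ["1200", "950", "1100", None],
})

# Basic cleaning: normalize case, drop incomplete rows, fix types
df["city"] = df["city"].str.title()
df = df.dropna()
df["revenue"] = df["revenue"].astype(int)

print(df["revenue"].sum())  # total revenue of the clean rows: 2150
```

This is the kind of drop-nulls-and-fix-types pass that typically precedes any analysis of large raw data.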
Data Engineer @ Hugo Edge Solutions, Bangalore
As a data engineer, the candidate will analyze large amounts of information to discover trends and patterns, present information using data visualization techniques, and design, develop, and launch highly efficient data pipelines to move data.
- 4-6 years of IT Development Experience
- Data engineering chops with big data technologies (AWS, Hadoop) and the drive to improve data quality, accuracy, and completeness
- Proficiency with handling structured and unstructured data sets, writing complex queries, and the occasional stored procedure
- Ability to design, build, and query data warehouses (MySQL, PostgreSQL, Oracle, MS SQL, Redshift)
- Experience in building or improving large scale data infrastructure from the ground up – Ever hear of Snowflake?
- AWS Certified Solution Architect or Developer Associate required; Professional certification preferred
- Experience with Agile and the Enterprise SDLC (functional and unit testing, etc)
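As a rough illustration of the "writing complex queries" requirement above, here is a small aggregate query against an in-memory SQLite table; the table and column names are invented for the example:

```python
import sqlite3

# In-memory warehouse-style table (names are hypothetical)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("South", 100.0), ("South", 250.0), ("North", 80.0)])

# Aggregate query: total order value per region, largest first
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM orders GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('South', 350.0), ('North', 80.0)]
```

The same GROUP BY / ORDER BY pattern carries over directly to the warehouse engines the listing names (MySQL, PostgreSQL, Redshift, etc.).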
AI Architect @ ZS, Pune
ZS's Software Development Team designs, implements, tests, and supports high-quality products used by hundreds of companies and thousands of end users to make critical decisions and manage sales operations.
- Knowledge and experience in some of the key AI platforms, e.g. AWS SageMaker, IBM Watson, Microsoft Azure, Google Api.ai, Facebook Wit.ai, and chatbots using Microsoft Bot Framework
- Serve AI/ML models in enterprise grade technology platforms through microservices
- TensorFlow, Caffe, CNTK, commercial technologies/platforms, etc.
- Experience working in a DevOps environment and using industry-standard tools (Git, JIRA, TeamCity, etc.)
- Able to explain technical concepts in non-technical language
- Solid hands-on experience with Artificial Intelligence platforms, with an understanding of the end-to-end life cycle of AI projects
- Solid, proven hands-on experience with UI technologies (AngularJS, ReactJS, StencilJS, etc.), Java/J2EE, Kubernetes, the Spring framework, microservices, REST APIs, Kafka, etc.
- Exposure in Data Governance and management
- Exposure to Hadoop data science workbench solutions
- Preferred experience in any one of the integrated AI products, such as Microsoft Azure ML or AWS ML
- Solid experience with any one of the ML pipeline solutions (Dataiku, Anaconda, KNIME) and auto-ML solutions such as H2O.ai, DataRobot, or Firefly.ai
Big Data Engineer @ Byte Prophecy Pvt. Ltd, Ahmedabad
The candidate will design databases and data pipelines for storing and processing large, sometimes unstructured datasets for use with our analytics platform, and will execute batch jobs on our custom-built computing cluster, on standard ETL tools, or via custom code in SQL, Java, or Python.
- Experience with SQL (MySQL), columnar (MariaDB/InfiniDB), and NoSQL (Cassandra) databases
- Familiarity with programming best practices, design patterns, version control systems
- A sound understanding of parallel/distributed programming
- Good command in Java or Python
- The ability to work effectively with people from a variety of backgrounds
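The parallel-programming requirement above can be sketched with Python's standard thread pool; the `tokenize` function and input records are hypothetical stand-ins for a per-record transformation in a batch job:

```python
from concurrent.futures import ThreadPoolExecutor

def tokenize(line):
    # Hypothetical per-record transformation in a batch job
    return len(line.split())

lines = ["big data engineer", "sql java python", "ahmedabad"]

# Fan the work out across a pool of workers; map preserves input order
with ThreadPoolExecutor(max_workers=4) as pool:
    counts = list(pool.map(tokenize, lines))

print(counts)  # [3, 3, 1]
```

The same map-over-a-pool pattern scales up conceptually to cluster frameworks such as Spark, where the pool becomes a fleet of machines.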
Senior Data Scientist @ Volvo Group, Bangalore
At Volvo, a Data Scientist is expected to understand business requirements and KPIs, convert them into analytical hypotheses in a structured and logical manner, identify solutions, and work with stakeholders throughout the organization to find opportunities for leveraging company data to drive business solutions.
- Experience using statistical programming languages and tools (Python, TensorFlow, SQL, etc.) to manipulate data and draw insights from large datasets
- Strong conceptual understanding of machine learning algorithms, including linear regression, logistic regression, decision trees, random forests, topic models, etc.
- Experience working with and creating data architectures.
- Knowledge of a variety of machine learning techniques (regression, classification, clustering, time series, neural networks, etc.)
- Knowledge of GLM/regression, random forests, boosting, trees, text mining, social network analysis, etc.
- Exposure to visualization tools, especially Qlik Sense, is a plus
- Exposure to analytics platforms, especially Azure, is a plus
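As a minimal illustration of the regression fundamentals the listing asks for, here is a hand-rolled ordinary-least-squares fit using only the standard library; the data points are invented for the example:

```python
from statistics import mean

# Made-up data: vehicle age (years) vs. maintenance cost (arbitrary units)
x = [1, 2, 3, 4, 5]
y = [10, 12, 15, 17, 21]

x_bar, y_bar = mean(x), mean(y)

# Ordinary least squares for y = a + b*x:
# slope b = Sxy / Sxx, intercept a = y_bar - b * x_bar
b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
    / sum((xi - x_bar) ** 2 for xi in x)
a = y_bar - b * x_bar

print(round(b, 2), round(a, 2))  # 2.7 6.9
```

In practice a library such as scikit-learn or statsmodels would do this fit, but the closed-form slope/intercept above is the conceptual core interviewers probe for.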