Bigdata Recruitment | Software Engineer



Bigdata is conducting an off-campus recruitment drive to hire candidates for the role of Software Engineer. Interested candidates can read the details below and apply as soon as possible.

About: Big data is a field that treats ways to analyze, systematically extract information from, or otherwise deal with data sets that are too large or complex to be handled by traditional data-processing application software. Data with many fields (columns) offer greater statistical power, while data with higher complexity (more attributes or columns) may lead to a higher false discovery rate. Big data analysis challenges include capturing data, data storage, data analysis, search, sharing, transfer, visualization, querying, updating, information privacy, and data sources. Big data was originally associated with three key concepts: volume, variety, and velocity. Because sampling such data is itself challenging, earlier approaches allowed for only observations and sampling. Big data therefore often refers to data sets whose size exceeds the capacity of traditional software to process within an acceptable time and value.

Position: Software Engineer

Location: Chennai


Requirements:
  • Experience of test-driven development alongside the use of automated test frameworks, mocking and stubbing and unit testing tools.
  • Knowledge of the key phases of software delivery lifecycle and established software development methodologies.
  • Experience of working in an environment where products must be delivered to specific timescales.
  • An understanding of how to translate product and business requirements into technical solutions.
  • The ability to understand and support, modify and maintain systems and code developed by other engineering teams.
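The mocking and stubbing mentioned in the requirements can be illustrated with a minimal sketch using Python's standard `unittest.mock`; the `CurrencyConverter` class and its rate-provider dependency are hypothetical, invented here only for illustration.

```python
from unittest.mock import Mock

# Hypothetical class under test (invented for illustration): converts an
# amount using an injected rate provider, which would be an external
# dependency (e.g. a network service) in real code.
class CurrencyConverter:
    def __init__(self, rate_provider):
        self.rate_provider = rate_provider

    def convert(self, amount, from_ccy, to_ccy):
        rate = self.rate_provider.get_rate(from_ccy, to_ccy)
        return round(amount * rate, 2)

def test_convert_uses_stubbed_rate():
    # Stub the external dependency so the test is fast and deterministic.
    provider = Mock()
    provider.get_rate.return_value = 1.25

    converter = CurrencyConverter(provider)

    assert converter.convert(100, "GBP", "USD") == 125.0
    # Verify the collaborator was called exactly once with the right arguments.
    provider.get_rate.assert_called_once_with("GBP", "USD")

test_convert_uses_stubbed_rate()
```

In a test-driven workflow the test above would be written first, against the injected dependency's interface, before the converter itself is implemented.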
Job Description:
  • Design and develop reusable libraries and application programming interfaces for use across the bank.
  • Design and develop software that is amenable to greater automation of the build, release, test and deployment processes across all environments.
  • Support the reuse and sharing of platform components and technologies within the software engineering teams.
  • Deliver software components to enable the delivery of platforms, applications and services.
  • Write unit and integration tests in automated test environments to ensure code quality.


Eligibility:
  • 3 to 15 years of expertise in data and analytics concepts: Big Data, Hadoop, Hive, Impala, Sqoop, Oozie (PySpark, Athena, Presto, Airflow, CLI, Glue), Scala, and cloud concepts.
  • Must have a strong understanding of data architecture, data integration and data management processes, and the ability to develop transformation logic in Spark SQL based on DataFrames.

