What we want:
We are looking for a Big Data Developer with 6 months of internship experience to design, develop, and maintain scalable big data solutions. The role involves working with Hadoop ecosystems, real-time and batch data processing frameworks, and cloud-based platforms.
Who we are:
Vertoz (NSEI: VERTOZ), an AI-powered MadTech and CloudTech Platform offering Digital Advertising, Marketing and Monetization (MadTech) & Digital Identity and Cloud Infrastructure (CloudTech), caters to Businesses, Digital Marketers, Advertising Agencies, Digital Publishers, Cloud Providers, and Technology companies. For more details, please visit our website.
What you will do:
Responsible for the documentation, design, development, and architecture of Hadoop applications.
Must have knowledge of Big Data technologies such as Impala, Hive, Hadoop, Spark, Spark Streaming, batch processing, Kafka, etc.
Excellent programming skills in Java, Python, and/or Scala (any one is fine).
Must have knowledge of relational SQL and NoSQL databases, including Vertica.
Requirements
6 months of experience working with Big Data technologies.
Strong knowledge of Hadoop ecosystem tools, including Hadoop, Hive, Impala, Spark, Spark Streaming, and Kafka.
Familiarity with batch and real-time data processing frameworks.
Proficiency in at least one programming language: Java, Python, or Scala.
Familiarity with stream-processing systems such as Spark Streaming, Storm, or Flume.
Good understanding of relational SQL and NoSQL databases, including Vertica.
Basic to intermediate shell scripting skills.
Strong problem-solving skills and ability to work in a fast-paced environment.
Benefits
No dress code
Flexible working hours
5-day work week
24 annual leave days
International Presence
Celebrations
Team outings
Required Education:
Graduate