Sorry, this offer is no longer available,
but you can perform a new search or explore similar offers:

Urgently Need of Java Developers (WFH), Earn 45K/Month

Qualifications for Java Developer: Professional IT Certification preferred. 1-3 years of experience developing apps with Spring Boot experience. Expert level in J...


From Neksjob Philippines - National Capital Region

Published a month ago

Get Hired, Apply Now! Telco Account Willing To

Customer Service Representative | Telecommunications Account | Alphaland Makati Site. Compensation: 16,000 - 19,500 (+3,000 to be added to basic upon graduatio...


From Neksjob Philippines - National Capital Region

Published a month ago

German Client Support Analyst

Requirements: Fluent in German and English; must be open to a shifting schedule; amenable to work in Dumaguete or Paranaque; can start as soon as po...


From J-K Network Services - National Capital Region

Published a month ago

Big Data Solution Architect

Company:

Hunter's Hub Inc


Details of the offer

The Big Data Solution Architect addresses specific big data problems and requirements, describing the structure and behavior of a big data solution and how it can be delivered using big data technologies such as Hadoop. Candidates must have hands-on experience with Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning). The architect is responsible for managing the full life cycle of a Hadoop solution: requirements analysis, platform selection, technical architecture design, application design and development, testing, and deployment of the proposed solution.
Roles and Responsibilities:
Familiar with key concepts of distributed processing for big data technologies and other related technologies in the field.
Experience leading a team or performing a Proof of Concept (POC) and producing results and/or recommendations on how to use the technology in conjunction with current applications.
Proven ability to find integration points between new technology and existing legacy technology.
Experience designing solutions with fully redundant systems and DR processes/failovers.
Ability to draft end-to-end solutions, including logical and physical implementation designs.
Flexibility to deploy cloud solutions on-premises, on-cloud, in hybrid architectures, or in virtualized implementations.
Ability to make suggestions on how to improve existing processes and reduce risks through redesigns.
Ability to explain how business requirements are satisfied by the proposed technological solution, and to explain this to the customers as the main beneficiaries of the proposed solutions.
Experience evaluating vendor proposals and giving recommendations on the best approach to take.

Minimum Qualifications
Key Technology Requirements:
Familiar with data warehousing concepts, as EDS is mandated to perform key data warehousing functions
Knowledgeable and proficient with existing legacy technologies and their integration points:

o Databases (Oracle, Teradata, Vertica etc.)
o Reporting Technologies (PowerBI, Tableau, Cognos etc.)
o ETL Technologies (Talend, Informatica, etc.)
Experience with a cloud technology (AWS, Google Cloud, Microsoft Azure)
Familiar with virtualization technologies (VMWare, Docker, Kubernetes)
Familiar with both batch and streaming processing technologies (Spark, MapReduce, Kafka, Flink)
In-depth experience with API integrations
Knowledgeable about overall system security, such as Kerberos and other security layers and tools (e.g. cloud platforms ship their own security implementations)
Familiar with redundancy software (Keepalived, load balancers, HAProxy, etc.)
Proficient in the big data ecosystem and surrounding technologies (HDFS, YARN, Cloudera, etc.)
Knowledge of machine learning platforms is a plus (how the ML process works)
Firm understanding of major programming/scripting languages such as Java, PHP, Ruby, Python, and/or R, along with Linux shell scripting
5 to 8 years' experience in a similar role
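For candidates unfamiliar with the batch-processing model mentioned above, a minimal sketch may help: the classic word-count example, written here in plain Python purely as an illustration of the map -> shuffle -> reduce phases that frameworks like Hadoop MapReduce and Spark execute in parallel across a cluster (this example is not part of the posting's requirements).

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # Map: emit a (word, 1) pair for every word in one input record.
    return [(word.lower(), 1) for word in line.split()]

def shuffle_phase(pairs):
    # Shuffle: group values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values into a final count.
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data solutions", "big data technology"]
pairs = chain.from_iterable(map_phase(line) for line in lines)
counts = reduce_phase(shuffle_phase(pairs))
print(counts["big"])  # -> 2
```

Each phase operates independently on its inputs, which is what lets a cluster scheduler distribute the work across many nodes; streaming engines such as Kafka and Flink apply the same key-grouping idea continuously over unbounded data.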

Qualifications and Accreditations:
Amazon Web Services (AWS) Certified Data Analytics – Specialty
Cloudera Certified Associate (CCA) Spark and Hadoop Developer


Source: Jora
