Big Data Solution Architect

Company: Hunter's Hub Inc.


Details of the offer

The Big Data Solution Architect addresses specific big data problems and requirements, describing the structure and behavior of a big data solution and how it can be delivered using big data technology such as Hadoop. The role requires hands-on experience with Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning). The Big Data Solution Architect is responsible for managing the full life cycle of a Hadoop solution, including requirements analysis, platform selection, design of the technical architecture, application design and development, testing, and deployment of the proposed solution.
Roles and Responsibilities:
Familiar with key concepts of distributed processing for big data technologies and other related technologies in the field.
Experience leading a team or performing a proof of concept (POC) and producing results and/or recommendations on how to use the technology in conjunction with current applications.
Proven ability to find integration points between new technology and existing legacy technology.
Experience designing solutions with fully redundant systems and DR processes/failovers.
Ability to draft end-to-end solutions, including logical and physical implementation designs.
Flexibility to deploy solutions on-premises, in the cloud, or, in some cases, as hybrid or virtualized implementations.
Make suggestions on how to improve existing processes and reduce risks through redesigns.
Explain to customers and the main beneficiaries of the proposed solution how it satisfies the business requirements.
Experience evaluating vendor proposals and giving recommendations on the best approach to take.

Minimum Qualifications
Key Technology Requirements:
Familiar with data warehousing concepts, as EDS is mandated to perform key data warehousing functions.
Knowledgeable and proficient with existing legacy technologies and their integration points:
- Databases (Oracle, Teradata, Vertica, etc.)
- Reporting technologies (PowerBI, Tableau, Cognos, etc.)
- ETL technologies (Talend, Informatica, etc.)
Experience with a cloud technology (AWS, Google Cloud, Microsoft Azure).
Familiar with virtualization and containerization technologies (VMware, Docker, Kubernetes).
Familiar with both batch and stream processing technologies (Spark, MapReduce, Kafka, Flink).
In-depth experience with API integrations.
Knowledgeable about overall system security, such as Kerberos and other security layers and tools (e.g. cloud platforms have their own security implementations).
Familiar with redundancy software (Keepalived, load balancers, HAProxy, etc.).
Proficient in the big data ecosystem and surrounding technologies (HDFS, YARN, Cloudera, etc.).
Knowledge of machine learning platforms (how the ML process works) is a plus.
Firm understanding of major programming/scripting languages such as Java, Linux shell scripting, PHP, Ruby, Python, and/or R.
5 to 8 years' experience in a similar role.

Qualifications and Accreditations:
Amazon Web Services (AWS) Certified Data Analytics – Specialty
Cloudera Certified Associate (CCA) Spark and Hadoop Developer


Source: Jora
