Hello, good day!
Please find below our open requirement and share resumes ASAP:
Big Data Architect
Location: Milwaukee, US
Experience: 12+ years
Description:
1. Exceptional understanding of and hands-on development experience with the Hadoop ecosystem (MapReduce, HDFS, Pig, Hive, Sqoop, Impala, Mahout, Oozie, Apache Spark, and related tech stacks)
2. Facilitate requirement-gathering interviews, conduct workshops, issue clarifications, review documentation, ensure traceability, obtain sign-off from all stakeholders, and own change management of scope in order to define the project requirements (both functional and non-functional)
3. Build formal and informal customer relationships at the project level by engaging customers regularly; act as the single point of contact (SPOC) for project-related activities, providing regular reports, articulating value, and managing expectations
4. Lead estimation activities in line with org/unit goals, review the estimates, and communicate them to all stakeholders in order to plan the budget and resources required to execute the project successfully
5. Experience creating the solution architecture for a client's Hadoop data lake
a. Lead workshops and discussions on requirements, architecture, and design, and translate them into the right Big Data (Hadoop) solutions
b. Understand the client's requirements and pain points and propose a solution architecture that addresses them
c. Research the components that best fit the proposed architecture
d. Conduct proofs of concept (POCs) to validate the architecture, if required
e. Revalidate the solution architecture based on learnings from the POCs conducted
f. Develop a roadmap for moving the solution from proof of concept through design, development, testing, and productionization
6. Experience in cluster setup and cluster architecture
7. Experience in sizing clusters based on existing data volume and projected volume growth (see the sizing sketch after this list)
8. Experience interacting with product vendors such as Hortonworks or Cloudera
9. Experience designing data pipelines using Oozie workflows
10. Experience using Sqoop/Flume to pull data from various source systems (data ingestion); a sample ingestion step follows this list
11. Experience designing and implementing Hive managed tables; an example table definition follows this list
12. Experience in Hadoop security setup and validation (LDAP/AD, etc.)
13. Experience in Hadoop performance tuning (benchmarking and query tuning)
14. Experience with Hadoop ecosystem governance/audit and data analytics tools such as Tableau
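As referenced in item 7, here is a minimal back-of-the-envelope cluster-sizing sketch in Python. All input figures (current volume, growth rate, replication, headroom, per-node capacity) are illustrative assumptions, not numbers from this requirement; only the arithmetic is the point.

```python
# Rough HDFS storage-based cluster sizing. Every input below is an assumption
# chosen for illustration; substitute the client's actual numbers.
raw_data_tb = 200.0          # current data volume in TB (assumed)
annual_growth = 0.40         # 40% yearly growth (assumed)
years = 3                    # planning horizon (assumed)
replication_factor = 3       # default HDFS replication
overhead = 1.25              # headroom for temp/intermediate data (assumed)
usable_tb_per_node = 36.0    # usable disk per worker node after OS reserve (assumed)

projected_raw = raw_data_tb * (1 + annual_growth) ** years
required_storage = projected_raw * replication_factor * overhead
nodes_needed = -(-required_storage // usable_tb_per_node)   # ceiling division

print(f"Projected raw data after {years} years: {projected_raw:.1f} TB")
print(f"Total HDFS storage required: {required_storage:.1f} TB")
print(f"Worker nodes needed: {int(nodes_needed)}")
```

In practice, compute and memory (YARN containers, Spark executors) are sized alongside storage, but a storage calculation like the one above is the usual starting point.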
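For item 10, a minimal sketch of one Sqoop-based ingestion step driven from Python. The JDBC URL, credentials path, table, and target directory are placeholder assumptions, not real systems.

```python
import subprocess

# All connection details below are placeholders for illustration only.
sqoop_cmd = [
    "sqoop", "import",
    "--connect", "jdbc:mysql://source-db.example.com:3306/sales",
    "--username", "etl_user",
    "--password-file", "/user/etl/.db_password",   # keep credentials off the command line
    "--table", "orders",
    "--target-dir", "/data/raw/sales/orders",
    "--num-mappers", "4",
    "--as-parquetfile",
]

# Invokes the standard `sqoop import` CLI; raises if the job exits non-zero.
subprocess.run(sqoop_cmd, check=True)
```

A step like this is typically wrapped in an Oozie workflow action (item 9) rather than run ad hoc, so that scheduling, retries, and dependencies are handled by the workflow engine.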
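For item 11, a minimal sketch of creating a Hive managed table through Spark SQL (PySpark with Hive support enabled). The database, table, and column names are illustrative assumptions.

```python
from pyspark.sql import SparkSession

# Assumes PySpark is installed and a Hive metastore is configured for this deployment.
spark = (
    SparkSession.builder
    .appName("hive-managed-table-example")
    .enableHiveSupport()
    .getOrCreate()
)

spark.sql("CREATE DATABASE IF NOT EXISTS sales_db")

# A managed (non-EXTERNAL) table: the metastore owns both the metadata and the
# data files under the Hive warehouse directory.
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales_db.daily_orders (
        order_id     BIGINT,
        customer_id  BIGINT,
        amount       DECIMAL(12,2)
    )
    PARTITIONED BY (order_date STRING)
    STORED AS PARQUET
""")
```

The design consequence of a managed table is that dropping it also deletes the underlying data files, unlike an EXTERNAL table, which only removes the metadata.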
Ravinder | SYSMIND, LLC
Phone: 609-897-9670 x 2177
Website: sysmind.com
Address: 38 Washington Road, Princeton Junction, NJ 08550