**Big Data Architect**
At Bank of the West, our people are having a positive impact on the world. We’re investing where we feel we can make the most impact, like advancing diversity and women’s entrepreneurship programs, financing for more small businesses, and promoting programs for sustainable energy. From our locations across the U.S., Bank of the West is taking action to help protect the planet, improve people’s lives, and strengthen communities. We are part of BNP Paribas, a global leader supporting the UN Sustainable Development Goals (SDGs). Yes, we’re a bank, but as the bank for a changing world, we are continually seeking to improve the ways we help our customers, while contributing to more sustainable and equitable growth.
We are looking for a Big Data/Hadoop Architect to support our existing systems and our strategic data hub project. The ideal candidate has at least 4 years of experience as a Hadoop Administrator and at least 2 years as a Linux Administrator, with proven hands-on experience installing, configuring, supporting, and managing clusters on the Hortonworks and Cloudera distributions.

**Responsibilities**
- Administer Hadoop clusters, users, and cluster resources.
- Plan and size cluster capacity.
- Perform routine operations such as adding and removing nodes and troubleshooting failures.
- Secure Hadoop clusters using Kerberos, Knox, and LDAP integration.
- Set up security policies, including ACLs and role-based access control (RBAC), using Ranger, Sentry, etc.
- Monitor, tune, and improve Hadoop clusters to keep them healthy.
- Evaluate and enable new Hadoop components.
- Install and upgrade R, Python, Spark, JupyterHub, and other analytics tools and related packages.
- Integrate data sources such as Oracle and SQL Server, and enterprise tools such as OBIEE and Tableau.
- Support the user base through adequate resource management on Hadoop clusters.
- Develop best practices and train users.
- Deploy Hadoop components in Docker containers.
**Qualifications**
- At least 4 years of working experience as a Hadoop Administrator on the Cloudera or Hortonworks Hadoop ecosystem.
- Deep understanding and strong conceptual knowledge of Hadoop architecture components.
- Strong hands-on experience with and knowledge of Hadoop core components such as HDFS, YARN, Hive, Spark, and Kafka.
- Hands-on experience with and knowledge of Linux and hardware.
- Strong hands-on scripting experience, including Bash and Python.
- Strong analytical mindset for solving complicated problems.
- Desire to resolve issues and dig into potential problems.
- Self-starter who works with minimal supervision; able to work in a team with diverse skill sets.
- Ability to understand customer needs and requests and provide the correct solution.
- Contributions to the open-source Hadoop community (experience as a Contributor or Committer).
- Experience with tools such as Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, etc.
- Basic experience with and knowledge of an automation tool such as Chef, Puppet, or Ansible.
- Hands-on programming experience in Python.
- Experience in the end-to-end design and build of near-real-time and batch data pipelines.
Bachelor’s degree in Computer Science, Information Technology, or a finance-related field; or equivalent relevant experience.
**Equal Employment Opportunity Policy**
Bank of the West is an Equal Opportunity employer and proud to provide equal employment opportunity to all job seekers without regard to any status protected by applicable law. Bank of the West is also an Affirmative Action employer - Minority / Female / Disabled / Veteran.
Bank of the West will consider for employment qualified applicants with criminal histories pursuant to the San Francisco Fair Chance Ordinance subject to the requirements of all state and federal laws and regulations.