North Highlands CA 95660
Seeking a Hadoop Architect position where I can apply my skills and experience in Hadoop, HDFS, Hive, Pig, HBase, Sqoop, MongoDB, Flume, Oozie, ZooKeeper, Ganglia, Nagios, Cloudera CDH3/CDH4, Hortonworks Ambari, AWS (Amazon EC2, S3), HiveQL, and Pig Latin.
Expertise in web servers:
- IBM WebSphere
- BEA WebLogic
Familiarity with programming languages:
Well-versed in reporting tools:
- SAS EBI Suite
Skilled in data warehousing:
Effectively worked with operating systems:
Adept in RDBMS:
- SQL Server
Proficient with SDLC methodologies:
Specialist in ETL tools:
Axius Technologies Inc.
January 2012 – Present
- Administered and supported the Hortonworks distribution of Hadoop.
- Generated and updated reports and queries in Splunk.
- Formulated and executed requirements in SAS designs and code.
- Provided technical assistance during impact assessments and implementation.
- Prepared and maintained Java classes for Avro file formats.
- Implemented code migration and participated in code reviews.
- Maintained and worked with libraries such as Dozer and the Java Reflection API.
- Recommended upgrades for existing Hadoop cluster applications.
November 2010 – January 2012
Tysons Corner, VA
- Designed and developed J2EE web applications with the Struts framework.
- Provided technical guidance during SDLC for Java enterprise applications.
- Participated in gathering and analysis of business and technical requirements.
- Developed and implemented mobile applications for Android devices.
- Implemented Big Data solutions and analyzed virtual machine requirements.
- Maintained detailed documentation of logical and software architecture.
- Formulated and enforced design standards for data analytics systems.
- Reviewed administrator processes and updated system configuration documentation.
Master’s Degree in Information Technology
Louisiana Tech University
August 2008 – May 2010
Cloudera Certified Administrator for Apache Hadoop (CCAH)