Application Architect Resume
Phoenix, AZ
SUMMARY
- 10+ years of experience in the complete software development life cycle across Big Data, data warehousing, and mainframes; currently working as an Application Architect at Confidential.
- Cloudera Certified Developer for Apache Hadoop (CCDH).
- Experience implementing end-to-end Hadoop solutions, including data acquisition, data streaming, storage, transformation, analysis, and integration with other frameworks to meet business needs, using a Big Data/Hadoop technology stack that includes Hadoop, MapReduce, HDFS, Hive, Pig, HBase, Oozie, Sqoop, Kafka, Flume, Storm, Spark, Spark SQL, Tez, and YARN/MRv2.
- Experience managing offshore teams, including software development, project management, QA, and release engineering.
- Expertise in Cloudera Hadoop environments (CDH 5.4.0, Hue UI). Experience importing and exporting analyzed data between HDFS and traditional databases such as Teradata and Oracle using Sqoop (a minimal sketch follows this summary).
- Proficient in AWS RDS/EC2 instances, Ab Initio, Teradata, DB2, PostgreSQL, UNIX, COBOL, SAS, Easytrieve, and JCL.
- Experience in enterprise application performance, scalability strategies and best practices.
- Excellent leadership, team-building, analytical, problem-solving, written communication, and interpersonal skills.
- Defined solutions to identify data issues in the operational data store for credit card applications and improved data quality through multiple redesigns of the existing platform.
- Improved the performance of the credit card collections dialer process by simplifying the design and moving processing to a different platform, reducing delivery time from 6 hours to 2 hours. This made the collections dialer process markedly more efficient by providing data to dialer agents on time, enabling prompt customer calls.
- Forged strong partnerships with business teams, in both agile and traditional methodologies, to understand and refine business requirements and provide appropriate architectural and technology solutions.
- Consistently delivered more than 5 simultaneous projects over the last 2 years, spanning multiple teams and technologies.
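As a minimal illustration of the Sqoop usage noted above: a Java sketch that drives a Sqoop import through what I understand to be its programmatic entry point (org.apache.sqoop.Sqoop.runTool). The connection string, credentials, table, and target directory are hypothetical placeholders, not values from an actual engagement.

    import org.apache.sqoop.Sqoop;

    public class AnalyzedDataImportSketch {
        public static void main(String[] args) {
            // Placeholder arguments; runTool parses them exactly like the sqoop CLI.
            String[] importArgs = {
                "import",
                "--connect", "jdbc:oracle:thin:@//dbhost:1521/ORCL", // hypothetical source
                "--username", "etl_user",
                "--password-file", "/user/etl/.password",
                "--table", "ANALYZED_DATA",                          // hypothetical table
                "--target-dir", "/data/landing/analyzed_data",
                "--num-mappers", "4"
            };
            System.exit(Sqoop.runTool(importArgs)); // returns a CLI-style exit code
        }
    }

The reverse direction would use the "export" tool with --export-dir to move analyzed data from HDFS back into the RDBMS.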
TECHNICAL SKILLS
Big Data Skills: Hadoop, MapReduce, HDFS, Pig, Hive, HBase, Oozie, Sqoop, Kafka, Flume, Storm, ZooKeeper and Spark
Programming Languages: Java, Spring Batch framework
Platforms: UNIX, Windows, OS/390, AWS (EC2/RDS)
ETL/Testing Tools: Ab Initio GDE (1.15/3.1), Ab Initio Co-Op (2.15/3.1), DataStage, UNIX Shell Scripting, SQL
Mainframe Skills: COBOL, Easytrieve, JCL, VSAM, CICS, REXX, Native SQL and SAS
Databases: DB2 v10, Teradata v12, MongoDB, PostgreSQL
Scheduling Tools: Control-M, OPC and CA7
Methodologies: Agile, Scrum, Waterfall
Database Tools: IBM Data Studio, MS Visio, Teradata SQL Assistant, DbVisualizer
Development Tools: Ab Initio Web EME, PuTTY, ChangeMan, File Manager, GitHub, pgAdmin
PROFESSIONAL EXPERIENCE
Application Architect
Confidential, Phoenix, AZ
Responsibilities:
- Worked on the Data Ingestion framework for storing Amex travel data, GL data, and customer data into Hive; the framework uses MapReduce, Java, Pig, and Hive. It includes an audit mechanism, error logging, PGP security, and tokenization of data, and the stored data is used to generate reports and analytics with the BIRST tool.
- Working on the DTR real-time data ingestion process, which transmits OTA XML through a Kafka producer, processes the XML at the segment level using Storm, stores parsed segments in HDFS/Hive, and manages real-time updates using HBase (see the sketch after this job entry).
- Working on the PRDS project, another data source, which extracts data from an Oracle database, transfers the file via SFTP to a Hadoop cluster edge node, and creates reporting tables for reporting and analytics using the BIRST tool.
Environment: Agile-Scrum, Hortonworks 2.3, MapReduce, Hive, Pig, Sqoop, HBase, Oozie, Spark, Kafka, Storm, Flume, Unix, SVN, MySQL, YARN, REST and Java.
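As a rough illustration of the Kafka leg of the DTR pipeline above: a minimal Java producer that publishes an OTA XML payload to a topic for downstream Storm processing. The broker address, topic name, record key, and payload are hypothetical placeholders.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class OtaXmlProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker1:9092"); // placeholder broker
            props.put("key.serializer",
                      "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                      "org.apache.kafka.common.serialization.StringSerializer");

            String otaXml = "<OTA_HotelResNotifRQ>...</OTA_HotelResNotifRQ>"; // sample payload

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Keying by a record id (hypothetical) pins updates for the same entity to
                // one partition, keeping segment-level processing in Storm ordered per key.
                producer.send(new ProducerRecord<>("dtr-ota-xml", "recordId-123", otaXml));
            }
        }
    }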
Technical Lead
Confidential, Richmond, VA
Responsibilities:
- Worked on the card collections project to store credit card data in Hadoop HDFS and analyze it using MapReduce, Sqoop, Pig, and Hive. Worked on a new initiative to build MongoDB collections for the Long Term Payment Relief project, loading the data from HDFS files.
- Worked on the Mobile Field Dialing project to build online and batch processes using Spring Batch and a PostgreSQL database (see the sketch after this job entry).
- Delivered multiple programs and projects while leading a team across geographies and technologies, including Ab Initio ETL development, the DB2 operational data store, the Teradata card data warehouse, and mainframe development. Led the implementation of multiple redesign efforts to resolve latency issues, maintain data hygiene in the operational data store, and improve response times for stored procedures.
- Created high-level designs in adherence with the program's architectural roadmap and ensured alignment with the Enterprise Architecture's destination target architecture.
Environment: Agile-Scrum, CDH 5.4, MapReduce, Hive, Pig, Sqoop, HBase, Oozie, Flume, Unix, GitHub, Jenkins, CI/CD (Continuous Integration/Continuous Deployment), Control-M, PostgreSQL, Java Spring Batch, ETL-Ab Initio, DB2, Teradata and Mainframes.
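A minimal sketch of the kind of Spring Batch job referenced in the Mobile Field Dialing bullet above, in the JobBuilderFactory/StepBuilderFactory style of Spring Batch 3.x/4.x. The job and step names are hypothetical; a production job would replace the tasklet with a chunk-oriented reader/writer against the PostgreSQL database.

    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
    import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
    import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
    import org.springframework.batch.repeat.RepeatStatus;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    @EnableBatchProcessing
    public class FieldDialingBatchSketch {

        @Bean
        public Step extractStep(StepBuilderFactory steps) {
            // A real step would pair a JdbcCursorItemReader with a chunk-oriented
            // writer; a simple tasklet keeps the sketch self-contained.
            return steps.get("extractStep")
                    .tasklet((contribution, chunkContext) -> {
                        System.out.println("extracting dialer records...");
                        return RepeatStatus.FINISHED;
                    })
                    .build();
        }

        @Bean
        public Job fieldDialingJob(JobBuilderFactory jobs, Step extractStep) {
            return jobs.get("fieldDialingJob")
                    .start(extractStep)
                    .build();
        }
    }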
Senior Developer
Confidential
Responsibilities:
- Created high-level designs in adherence with the program's architectural roadmap and ensured alignment with the enterprise architecture's destination target architecture.
- Led a team across geographies and technologies, including Ab Initio, the DB2 operational data store, and mainframe development, and ensured projects stayed on track.
- Created reusable design patterns, and reusable code artifacts supporting those patterns, to drive time-to-market efficiencies.
- Provided innovative trade-off solutions to ensure the team delivered optimal business functionality and met timelines across multiple programs.
- Created an implementation strategy for ODS project rollouts to minimize incidents and outages.
Environment: Agile-Scrum, ETL-Ab Initio, Web EME, DB2, Teradata, Unix Scripting, ChangeMan, File-AID, Control-M, IBM Data Studio, Native SQL stored procedures, COBOL, JCL and CICS.
Senior Software Engineer
Confidential
Responsibilities:
- Gained an understanding of the Merrill Lynch and Confidential wealth management systems and PATH (Profiling and Asset Tracking of Households), a Merrill system.
- Prepared high-level designs, code, test plans, and unit tests using cross-platform technologies involving ETL (DataStage) and mainframes (COBOL, JCL, VSAM, and CICS).
- Reviewed low-level designs and code, and provided system testing and production support.
Environment: ETL-DataStage, DB2, Unix Scripting, Endevor, File-AID, Control-M scheduler, COBOL, JCL and CICS.
Senior Systems Engineer
Confidential
Responsibilities:
- As lead developer, worked on multiple projects, including enhancements to the Auto Policy Processing System (APPS), an online CICS (Customer Information Control System) application that manages automobile insurance for the Confidential group.
- Prepared designs and unit test data, and performed testing.
Environment: DB2, Unix Scripting, ChangeMan, File-AID, SPUFI, COBOL, JCL, Easytrieve, VSAM and CICS.
Senior Systems Engineer
Confidential
Responsibilities:
- Created detailed designs for modeling and mapping ADABAS to relational databases such as DB2 and Oracle using a relational tool for the RAD project.
- Performed analysis, coding, and testing (NATURAL/ADABAS) to extract data for the materialization/propagation process.
- Prepared designs and unit test data, and performed testing.
Environment: SDLC-Waterfall, DB2, ChangeMan, File-AID, SPUFI, NATURAL/ADABAS, COBOL, JCL, Easytrieve and VSAM.
Software Engineer
Confidential
Responsibilities:
- Understood and analyzed client requirements, and prepared detailed designs, code, and unit tests.
- Built COBOL programs and JCL to store data in DB2 warehouses in the form of facts and dimensions.
Environment: SDLC-Waterfall, DB2, ChangeMan, File-AID, SPUFI, COBOL, JCL, CICS and VSAM.