
Senior Architect Resume


PROFESSIONAL SUMMARY:

  • IT professional with over 16 years of experience in database production environments, specializing in project management, analysis, and architecture of client information, with a focus on infrastructure architecture, DevOps, Big Data, Hadoop, and data warehousing.
  • A business-focused Senior Enterprise Architect with 8+ years of digital strategy consulting with private and public sector leaders to simplify business complexities, design future state IT solutions, create change and transformation strategies, and prepare implementation plans to deliver and drive adoption of enterprise content.
  • Lead and own accountability for the design of business and IT vision, business architectures, data architectures, application functionality, solution architectures, infrastructure and DevOps architecture, implementation strategies, project planning, and presentations.
  • Create and present clear, visually illustrative, decision options with trade-offs for future state business functional requirements, solution blueprints, and application designs to drive understanding and enable rapid project decisions.
  • Lead and drive rapid brainstorming sessions to quickly assess current state complexities in business, data, application, and strategy to identify architectural challenges and automation opportunities for future state IT solutions.
  • Hands-on architect experience configuring system infrastructure on AWS, Azure, and CenturyLink Cloud, along with Hadoop, Jenkins, Kafka, Spark, MapReduce, HDFS, ZooKeeper, Oozie, Hive, Sqoop, Pig, Flume, Java, Python, ETL methodology, and NoSQL databases (MemSQL, HBase, MongoDB, Cassandra).
  • Profound knowledge of cloud, DevOps, project management, release management, and software engineering (SDLC, STLC), with strong communication, client interaction, client presentation, problem-solving, and interpersonal skills.

TECHNICAL SKILLS:

Cloud Computing Services: Amazon AWS, Microsoft Azure, CenturyLink

DevOps: Chef, Ansible, Puppet, Jenkins, Docker, AWS Elastic Beanstalk, AWS CloudFormation, CodePipeline and CodeDeploy, BuildForge, Ant, AnthillPro, uDeploy, RPM packages, Azure Service Fabric

Big Data Tools: Hadoop 2.2/1.2, MapReduce, HDFS, Hive, Pig, Impala, Storm, Flume, Kafka, Sqoop, Spark 1.0, YARN, ZooKeeper, Oozie, Splunk 6.0, Logstash, Kibana, Hue, Drill, AWS Kinesis, AWS EMR, Elasticsearch, Ambari, Cloudera Manager, Kafka cluster manager, MemSQL

NoSQL Databases: HBase 0.94.1, Cassandra 2.1.9/1.2.9, MongoDB 3.0

Relational Databases: Oracle 11g/10g/9i/8i, Sybase, MS SQL Server 2005/2008/2012, DB2, MySQL 5.1/5.5, Teradata 12

Programming: Java, Python, Scala, Shell, Perl and batch scripting, SQL, PL/SQL, T-SQL, C, C++

SCM Tools: ClearCase, GitHub, Bitbucket, Perforce

ETL Tools: Informatica 9.1/8.5/7.1.3/6.2, Pentaho Kettle (PDI) 5.0/4.1/3.5, Clover 4.1, Talend Data Integrator 5.6/6.0, SSIS, IBM InfoSphere DataStage 9.1, Ab Initio 3.1

Data Modeling: Physical, Logical, Relational and Dimensional Modeling, ER Diagrams, Erwin 4.0/3.5.2.

Operating Systems: Windows 7/XP/2000/98/NT, Windows Server 2012, Unix (Sun Solaris and AIX), Linux

Miscellaneous Tools: TOAD, Rapid SQL, SQL Developer, SQL*Plus, SQL Server Query Analyzer, Tableau, Autosys, Control-M, Quality Center, AlarmPoint, Service Center, Pace, RTC (Rational Team Concert), MS Office, SSMS, Nagios, Ganglia, Dynatrace, Datadog, PagerDuty

Management: Architect, Team Lead, Project Management, Release management, Agile, Scrum

PROFESSIONAL EXPERIENCE:

Confidential

Senior Architect

Responsibilities:

  • Created and implemented best practices for source code management, build/release management, and infrastructure-as-code management using AWS.
  • Responsible for cloud architecture evaluation and implementation.
  • Collaborated with DevOps teams across multiple locations to lead new projects on Amazon cloud services. Produced business-focused architecture and solution designs spanning virtualization/cloud computing, web, application, and database servers, storage, networking, and security.
  • Acted as a mentor for the development team; assisted developers and DevOps engineers in all aspects of the software life cycle, including definition, design, implementation, testing, and delivery; assigned and prioritized project-related tasks.
  • Designed an end-to-end CI/CD pipeline using Jenkins.
  • Responsible for designing the roadmap for analyzing system logs, sudo logs, and application and operational logs using Splunk/Hadoop/HDFS (MapR Hadoop distribution).
  • Architected a system to collect server logs using Kafka and transfer data from Kafka to HDFS for historical data analysis; provided a data analytics solution using Hunk.
  • Responsible for conceptualizing and designing end-to-end Solution Architecture blueprint for current infrastructure upgrades and future state technical capabilities for Confidential Big Data platform.
  • Transformed complex manual efforts into simple, automated, data-driven, user experiences for better decision-making and competitive advantage.
  • Shared deep knowledge of content and data management lifecycles and technology solutions to grow internal understanding of what is possible with new and future technologies, and brought in technology vendors to demo leading-edge content and data management systems for the core team.
  • Created environments on the AWS platform, including an AWS EMR Hadoop cluster, and implemented system alerts with Datadog and PagerDuty.
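
The Kafka-to-HDFS log collection described above can be sketched as batching logic in Python. This is a minimal, illustrative sketch only: the topic names and date-partitioned path layout are assumptions, and the production system used real Kafka consumers and HDFS writers rather than in-memory lists.

```python
from collections import defaultdict
from datetime import datetime, timezone

def partition_path(topic: str, ts: float) -> str:
    """HDFS-style date-partitioned path for a log record (layout is illustrative)."""
    day = datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y/%m/%d")
    return f"/logs/{topic}/{day}"

def batch_records(records, batch_size=1000):
    """Group (topic, timestamp, line) records into per-partition batches,
    flushing a partition once it reaches batch_size -- the same shape a
    Kafka-consumer-to-HDFS sink follows before writing out a file."""
    buffers = defaultdict(list)
    flushed = []
    for topic, ts, line in records:
        path = partition_path(topic, ts)
        buffers[path].append(line)
        if len(buffers[path]) >= batch_size:
            flushed.append((path, buffers.pop(path)))
    # Flush remainders (a real sink does this on a timer or at shutdown).
    flushed.extend(buffers.items())
    return flushed
```

Batching by partition path keeps HDFS writes large and sequential, which is why log sinks buffer rather than writing one file per message.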

Environment: Hadoop, HDFS, AWS, Flume, Kafka, Spark, Java, Splunk, Bitbucket, Datadog, Azure

Confidential, Burlington, MA

Senior Architect\Technical Implementation Manager

Responsibilities:

  • Presented technology and infrastructure decision options in transformation from legacy systems to automated, situational intelligence, and Cloud services infrastructure solution using Azure.
  • Managed a team of senior engineers implementing the key components of the product according to specifications; defined and enforced Big Data/Kettle best practices, SCM, DevOps, and infrastructure (IaaS) strategy across the organization, and agile practices across the team.
  • Designed roadmap for the use of Hadoop/HDFS system with MapReduce and created strategies for optimization of cluster utilization and implementation.
  • Architected and implemented a predictive model to analyze client behavior using Hadoop, Flume, Java, Pentaho, and Hive. Loaded highly unstructured and semi-structured data onto HDFS and implemented MapReduce jobs using Java and the Pentaho Kettle tool.
  • Articulated service visions, alignment of information technologies with enterprise strategy, and shared common solutions and best practices.
  • Authored complex multiyear statements of work and implementation plans for each client to engage the professional services team in building out solutions.
  • Led and facilitated internal understanding of the data management lifecycle, including automation of data aggregation, ingestion, cleansing, blending, and profiling to provide business insight.
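
The data profiling step mentioned above can be sketched in a few lines of Python. This is a minimal sketch under assumed conventions (nulls represented as None or empty strings); production profiling ran through Pentaho/Hadoop pipelines, not this helper.

```python
def profile_column(values):
    """Minimal column profile of the kind produced during data cleansing:
    row count, null count, distinct count, and a sample value."""
    non_null = [v for v in values if v not in (None, "")]
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "example": non_null[0] if non_null else None,
    }
```

A profile like this is typically run per column before blending, so cleansing rules (null handling, deduplication) are chosen from measured data rather than guesses.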

Environment: Hadoop, Hive, Flume, Sqoop, Pig, Java, Microsoft Azure, AWS, MongoDB, SQL Server 2012/2008/2005, Kettle (PDI) 5.0, Windows batch scripts, Perl scripts, Windows 2012 Server, Unix, Shell scripting, Oracle 11, Perforce

Confidential, Merrimack, NH

Architect\System Project Manager\Principal Software Engineer

Responsibilities:

  • Integrated Hadoop into traditional ETL, accelerating the extraction, transformation, and loading of massive structured and unstructured data.
  • Installed and configured Cloudera based Hadoop systems, developed centralized Service setup to start, stop, and expand data nodes. Involved in Capacity planning of Hadoop cluster.
  • Provided technical leadership in managing, designing and moving current application to cloud infrastructure system solution using AWS.
  • Defined and enforced project processes, policies, and AWS cloud best practices across the organization. Introduced advanced cloud services such as AWS EMR, CodePipeline, CloudFormation, CloudWatch, CloudFront, EC2, S3, Glacier, EBS, and RDS.
  • Created the roadmap for Big Data, ETL, and SCM/release engineering technology across the organization. Led and managed the infrastructure and database development teams; helped formulate estimates and timelines for project activities and set related goals.
  • Designed a MapReduce process using Informatica and Java/Python to load data into HDFS; architected ETL jobs to load trading data into a Cassandra database and transported Cassandra data into the data warehouse.
  • Designed a real-time computation engine using Flume, Kafka, Spark, and a complex event processing engine to provide credit card customers with credit line increases.
  • Designed, architected, and delivered a multiyear, complex data warehousing key initiative using Java, shell and Perl scripts, XML database schemas, SQL, PL/SQL, and T-SQL.
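
The MapReduce process above can be sketched with the classic map/shuffle/reduce phases in pure Python. This is an illustrative word-count-shaped sketch only; the actual jobs ran on Hadoop via Informatica and Java/Python, and the record format here is an assumption.

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Emit (key, 1) pairs per token -- the mapper side of the job.
    for word in record.split():
        yield word, 1

def shuffle(pairs):
    # Group values by key, as the Hadoop shuffle/sort stage does.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    # Aggregate each key's values -- the reducer side of the job.
    return key, sum(values)

def run_job(records):
    pairs = chain.from_iterable(map_phase(r) for r in records)
    return dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
```

The same three-phase shape applies whether the payload is word counts or trading records keyed for Cassandra loads; only the map and reduce functions change.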

Environment: Hadoop, HDFS, Amazon AWS EC2, HBase, Informatica 9.1/8/7/6, Oracle 11.2/10g, Sybase, Java, Shell scripting, Perl scripting, Autosys, Control-M, UNIX, GitHub

Confidential

Project Manager/Mentor/Lecturer

Responsibilities:

  • Designed data warehousing system for Confidential students.
  • Worked on designing the online student registration process for Confidential.
  • Mentored university students on Java, C++, database, SQL, and PL/SQL research projects.
