You can watch the replay of this Geek Sync webinar in the IDERA Resource Center: http://ow.ly/pg7N50A4svf.
Today's data management professionals are finding their landscape changing: they have multiple database platforms to manage, multi-OS environments, and everyone wants it now.
Join IDERA and Kellyn Pot’Vin-Gorman as she discusses the power of auto deployment in Azure when faced with complex environments and tips to increase the knowledge you need at the speed of light. Kellyn will cover scripting basics, advanced Portal features, opportunities to lessen the learning curve, and how multi-platform and multi-tier doesn't have to mean multi-cloud.
Attendees can expect to learn how to build automation scripts efficiently, even if you have little scripting experience, and how to work with Azure automation deployments. This session will allow you to begin building a repository of multi-platform development scripts to use as needed.
About Kellyn: Kellyn Pot’Vin-Gorman is a member of the Oak Table Network and an IDERA ACE and Oracle ACE Director alumnus. She is the newest Technical Solution Professional in Power BI with AI in the EdTech group at Microsoft. Kellyn is known for her extensive work with multi-database platforms, DevOps, cloud migrations, virtualization, visualizations, scripting, environment optimization tuning, automation, and architecture design. She has spoken at numerous technical conferences for Oracle, Big Data, DevOps, Testing and SQL Server. Her blog, http://dbakevlar.com, and her social media activity under the handle DBAKevlar are well respected for their insight and content.
Geek Sync | Deployment and Management of Complex Azure Environments
1. Deployment and Management of
Complex Azure Environments
Kellyn Pot’Vin-Gorman
Data Platform Architect at Microsoft
in Analytics and AI
Former DevOps Engineer and Multi-Platform DBA
Wed, Nov 14, 2018 9:00 AM Webinar
2. Kellyn Pot’Vin-Gorman
Data Platform Architect at Microsoft, Power BI and AI
• Former Technical Intelligence Manager, Delphix
• Multi-platform DBA, (Oracle, MSSQL, MySQL, Sybase,
PostgreSQL, Informix…) and DevOps Engineer
• Oracle ACE Director, (Alumni)
• Oak Table Network Member
• Idera ACE Alumni 2018
• STEM education with Raspberry Pi and Python, including
DevOxx4Kids, Oracle Education Foundation and TechGirls
• Former President, Rocky Mtn Oracle User Group
• President, Denver SQL Server User Group
• DevOps author, instructor and presenter.
• Author, blogger, (http://dbakevlar.com)
3. Agenda
The Power of the DBA in Azure
Scripting Best Practices
Automation
Use Case Example and Demo
Advanced Portal Features
Tips to Learning Faster
4. DBA Evolution
• De-emphasis on traditional database tasks:
• Backup/recovery
• MAA architecture
• Simple deployment
• End-user management
• Increased emphasis on:
• Automation
• Virtualization
• Data Modeling
• Scripting
5. A Few Facts: The Numbers Don’t Lie
• 1.7 MB of data created per person, per second by 2020
• 44 ZB of data by 2020
• 80% of data isn’t currently analyzed in any way.
• Even 10% added access to analyze could result in +$50 million in revenue for every Fortune 100 company.
6. Scripting Tips
• Create “wrapper” scripts that provide a uniform experience for script execution:
• Example: <wrapper name> <script name> <arguments>
• Use a script directory for ease of management and ease of backup/recovery/change control
• Use a version control repository, such as Git hosted on GitHub, etc.
• Don’t save passwords or keys in databases, and don’t save passwords in scripts or files on the OS.
• Use proper security at the OS level to control who can read, write and execute scripts.
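As a sketch of the wrapper idea above, here is a minimal bash version; the run_wrap name, the script directory layout, and the log naming are my own illustration, not from the session:

```shell
#!/usr/bin/env bash
# Hypothetical wrapper: gives every script the same logging and exit handling.
# Usage: run_wrap <script name> [arguments...]
SCRIPT_DIR="${SCRIPT_DIR:-$HOME/scripts}"   # central script directory (assumption)

run_wrap() {
  local script="$1"; shift
  local target="$SCRIPT_DIR/$script"
  if [ ! -x "$target" ]; then
    echo "ERROR: $target not found or not executable" >&2
    return 2
  fi
  local log_dir="$SCRIPT_DIR/logs"
  mkdir -p "$log_dir"
  local log="$log_dir/${script%.*}_$(date +%Y%m%d_%H%M%S).log"
  # Every script runs the same way: output captured to a timestamped log,
  # exit code of the script (not tee) preserved.
  "$target" "$@" 2>&1 | tee "$log"
  return "${PIPESTATUS[0]}"
}
```

A call like `run_wrap backup_db.sh prod` then matches the `<wrapper name> <script name> <arguments>` pattern, and the log directory sits inside the script directory so it is picked up by the same backup and change control.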
7. Automation and the Cloud Are Key to Much of the Tech Acceleration
• Less intervention from resources to maintain, manage and operate database platforms.
• Simple allocation of technology via the cloud.
• Advancement in tools to eliminate resource demands.
• More tool interconnectivity, allowing fewer steps to do more
• A resurgence of scripting to enhance automation, and graphical interfaces to empower those who have wider demands.
8. Our Use Case
• Large percentage of university customers in MHE searching for the same solution
• EDU team created a popular solution that allowed for community involvement
• Multi-Tier Deployment: SQL Databases, ADF, Analysis Services, Data and Power BI
• Ever-evolving
9. How do I know this?
• Three TSP Data Platform Architects to cover the US
• 100s of Higher Ed customers
• All of them interested in a solution vs. a product.
10. So Many Choices…
• Multiple options to automate:
• Azure CLI
• PowerShell with Azure Commands
• Azure DevOps
11. How Do I Get From Point A to Point Automate?
• Perform the task in the User Interface FIRST.
• Gather the information with the CLI to build out your scripts
• Test and retest comparing to the UI deployment
• Manage and maintain automation.
• Don’t go back to manual processing or manual intervention.
• Build out in phases- physical to logical, enhancing and improving as
we go along.
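The second step above, gathering information from a UI-built deployment with the CLI, can look like the sketch below; the capture_group helper name is mine, the dba_group resource group reuses the demo's naming, and az group show, az resource list, and az group export are standard Azure CLI commands:

```shell
# Capture a portal-built resource group so a script can reproduce it.
capture_group() {
  local rg="$1"
  az group show --name "$rg" --output json                  # group metadata
  az resource list --resource-group "$rg" --output table    # every resource in it
  az group export --name "$rg" > "${rg}_template.json"      # ARM template to seed automation
}
```

Comparing the exported template against your own script's output is one way to do the "test and retest" step.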
13. Azure CLI
• Allows a command line interface to the Azure
cloud
• Can be installed locally on Windows, Mac and
other OS platforms
• Can be run inside a Docker Container
• Can be used with the Azure Cloud Shell
without installation.
• Flexible and robust; allows for a CLI solution and automation of Azure deployments via scripting in PowerShell/BASH.
https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest
14. AZ CLI is Simple to Use and Robust
• C:\EDU_Docker>az vm create -n LinuxTstVM -g dba_group --image UbuntuLTS --generate-ssh-keys
• SSH key files 'C:\Users\kegorman.NORTHAMERICA\.ssh\id_rsa' and 'C:\Users\kegorman.NORTHAMERICA\.ssh\id_rsa.pub' have been generated under ~/.ssh to allow SSH access to the VM. If using machines without permanent storage, back up your keys to a safe location.
• - Running ..
• C:\EDU_Docker>az vm list -g dba_group
• C:\EDU_Docker>az vm delete -n LinuxTstVM -g dba_group
• Are you sure you want to perform this operation? (y/n): Y
16. Locating Information on ADF
• 1. Create one in the GUI
• 2. Inspect the resource facts via the CLI:
az resource list --location eastus
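That listing can be narrowed to just Data Factory resources with the standard --resource-type filter of az resource list; the list_adf wrapper name is my own:

```shell
# List only Data Factory resources in a region (default: eastus, as in the demo).
list_adf() {
  az resource list \
    --location "${1:-eastus}" \
    --resource-type "Microsoft.DataFactory/factories" \
    --output table
}
```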
17. Use the Azure Portal
• Document: Document as you automate, checking in docs with code. Don’t make anyone guess how you’ve accomplished what you’ve done.
• Automate: Automate everything that can be automated via APIs, scripts and processes, building as much out in the portal as you’re able.
• Recycle: Don’t reinvent the wheel.
• Use: Use monitoring and alerting features of the Portal.
19. HigherEdAnalyticsSolution
[Architecture diagram] Components: 1 Student Information System (contains all source data necessary for achieving the goals defined for the proof of concept), 2 VPN Gateway, 3 Data Factory, 4-6 SSIS DB / Staging Database / Data Warehouse, 7 Analysis Services, 8 Power BI
20. Deployment vs. Enjoyment
• Teams from universities are made up of varied technical backgrounds
• Data Scientists, DBAs, Data Analysts and Educators
• They spend more time deploying than working with the solution
• Slows down the time the EDU team’s limited resources get to work one-on-one with the customers
• Discovered some customers lost interest during the deployment phase or didn’t have the time to deploy
21. Goal
• Simplify: Simplify the deployment with DevOps practices
• Remove: Remove demands on knowledge required to deploy the solution.
• Create: Create more time for interaction and working with the solution by the education teams.
22. Build a Roadmap
1. Document All Pieces of Deployment
• Identify any interactive vs. default entries that will benefit the deployment.
• Update documentation as you go. Don’t try to do it at the end.
2. Categorize by Physical, Logical and Interactive
• Build out the physical deployment first, as it is the foundation.
• Build easy restart and cleanup steps into physical deployments
• Test and update any documentation to reflect the automation
3. Begin to Automate the Logical Slowly and in Phases
• Remove any manual steps or configurations.
• Take advantage of any plugins that ease work on the end users’ side
• Continue to accept feedback and enhance.
23. What is Involved
1. Two SQL Databases: one staging and one data warehouse
2. Azure Data Factory with an SSIS Database
3. Azure Analysis Services
4. Three Power BI Reports
5. CSV files for ongoing data loads, which will need to be configured for ongoing workloads
6. Multiple solution and project files, some deprecated in VS 2017
* Data loads and configuration via Visual Studio or SSDT solutions already built in.
* Sample data files in Excel could be replaced with the customer’s own data.
25. The Solution Was Very Repeatable
• Most resources and databases could have the same name in every
deployment.
• Outside of the CSV files containing example data, everything else
could be deployed without any changes by the customer if a few
dynamic parameters were pushed to the deployment.
• Although official documentation existed, there were numerous
versions and the process had evolved with the introduction and ease
of Azure deployment.
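One way to sketch those "few dynamic parameters" is a helper that derives every resource name from a customer prefix, so the database names stay identical in every deployment; the naming convention below is illustrative, not the EDU team's actual one:

```shell
# Derive all deployment names from the only two values that vary per customer.
derive_names() {
  local prefix="$1"
  LOCATION="${2:-eastus}"          # region, defaulting as an assumption
  RG="${prefix}_edu_rg"            # resource group
  SQL_SERVER="${prefix}-edu-sql"   # must be globally unique, hence the prefix
  STAGING_DB="StagingDB"           # identical in every deployment
  DW_DB="DataWarehouseDB"          # identical in every deployment
}
```

Everything downstream (firewall rules, ADF connections, documentation) can then reference the same variables instead of customer-edited values.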
26. Automate the Logical
• Migrate any onsite SSIS packages and workflows to Azure Data Factory
• ADF will build out an SSISDB in Azure SQL Database
• Store projects in pipelines
• Schedule, report and check into a GitHub repository to automate the development cycle.
• SLN and PROJ files are recycled.
27. Visual Studio/SSMS Dev Tools
• My predecessor and team members built in some automation already
using solution files!
• Awesome, these can be reused!
28. Walk Before You Run..
• Began to deploy individual resources.
• Had a final merge script, (aka wrapper) with a test script.
• Deployed piece by piece until phase I was completed.
• Received feedback from peers and customers as I proceeded.
29. Azure CLI Isn’t Enough – Cloud Shell
• Enhanced Azure CLI commands into BASH script to deploy and automate.
• A script to enhance automation and set variables to ease customer skill
requirements was required.
• From the Azure Portal:
Or Direct: https://shell.azure.com/
30. Initial Build in BASH
• Well, I’m a Linux Person
• Script is interactive, accepting customer’s requested
naming conventions and requirements.
• Builds out the physical resources in Azure
• SQL Server with a data warehouse and staging
database
• Azure Data Factory
• Azure Analysis Services
• Creates firewall rules for Azure Cloud Shell
• Creates all user access
• Creates Database objects
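A condensed, hypothetical sketch of that physical build is below. The commands are standard az CLI; every name, SKU, and admin value is a placeholder, and the ADF creation step is omitted because its CLI command shipped separately as an extension:

```shell
# Build the physical tier for one customer: resource group, SQL server,
# the two databases, Analysis Services, and a firewall rule so Azure
# services (including Cloud Shell deployments) can reach the server.
build_physical() {
  local prefix="$1" location="$2" admin_pw="$3"
  az group create --name "${prefix}_rg" --location "$location"
  az sql server create --name "${prefix}-sql" --resource-group "${prefix}_rg" \
    --location "$location" --admin-user eduadmin --admin-password "$admin_pw"
  az sql db create --name StagingDB --server "${prefix}-sql" \
    --resource-group "${prefix}_rg"
  az sql db create --name DataWarehouseDB --server "${prefix}-sql" \
    --resource-group "${prefix}_rg"
  az analysisservices server create --name "${prefix}as" \
    --resource-group "${prefix}_rg" --location "$location" \
    --sku S0 --admin-users "admin@example.com"
  # 0.0.0.0-0.0.0.0 is the special range meaning "allow Azure services".
  az sql server firewall-rule create --name AllowAzureServices \
    --server "${prefix}-sql" --resource-group "${prefix}_rg" \
    --start-ip-address 0.0.0.0 --end-ip-address 0.0.0.0
}
```

In the real script these values come from the interactive prompts rather than positional arguments.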
31. Use A Repository Implemented into the
Automation
https://github.com/EDUSolution/AutoEDU
33. Living Documentation
• Made for an easier transition as we redesigned.
• Kept track of all moving parts.
• Offered insight to those who knew the previous, manual process.
• Allowed for the roadmap to be included.
• Allowed for a troubleshooting section as other sections shrunk with automation.
35. What About PowerShell?
• The love of the SQL Community
• On the roadmap for the future as a secondary choice for deployment
• Still deciding if a PowerShell version is required.
• Need to wait for one tier that doesn’t deploy from a PS1 script.
36. PowerShell Makes the World Go ‘Round
But does it?
• As a DBA, you’ll need more than just
PowerShell to build the future of Azure.
• Learn BASH scripting basics.
• Consider Python basic knowledge if not
scripting skills.
• Azure commands
• Cross platform and cloud platforms will
be needed.
37. Benefits and Drawbacks
Terraform:
• Freemium, but great user community
• Scripting is proprietary
• Can build out anything that has a command line option
• Requires subscription, tenant, client API and password to be passed
• Was a bit overkill for what I was doing
Azure CLI:
• Newer product, but driven to support Azure
• Scripting can be done in PowerShell or BASH
• Is very interactive. If you want to script it, must do more.
• Good for check commands and settings
• Offers more support than Terraform, which can’t use some of the defaults
PowerShell/BASH:
• PowerShell has a leg up on BASH in the SQL World
• Scripting is powerful
• Can do more than just deploy. Scripting can become more application based.
• Can be as much as needed.
• Azure CLI was made to work with both.
38. Next Steps
• PowerShell version is in mid-development
• Microsoft professionals are more likely to know PowerShell than BASH.
• Will only require two main scripts to be updated with enhancements and additions to the repository.
• Automate the SSIS, (Integration Services) and Data Factory steps, (currently an unknown, still figuring it out.)
• Build out scripts to scale up the current “demo” version to an enterprise version.
• Script to automatically pause integration and factory services to save on cost.
• Script out data workload inputs for new features/data model and proprietary data loads.
• Build out all steps using Azure DevOps to continue the growth of the automation.
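The pause-for-cost idea can already be scripted for the Analysis Services tier with the standard az analysisservices server suspend/resume commands; the naming convention here is assumed from this deck:

```shell
# Suspend Analysis Services compute off-hours to stop its billing,
# and bring it back before business hours.
pause_solution() {
  local prefix="$1"
  az analysisservices server suspend --name "${prefix}as" \
    --resource-group "${prefix}_rg"
}

resume_solution() {
  local prefix="$1"
  az analysisservices server resume --name "${prefix}as" \
    --resource-group "${prefix}_rg"
}
```

A pair of scheduled jobs (cron, or an Azure Automation runbook) calling these gives the cost saving without manual intervention.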
39. Success
• Manual process takes anywhere from two full days of onsite meetings to 8 weeks of remote meetings to deploy.
• New automated process deploys in less than 15 minutes after the customer answers questions in the interactive script.
• Offers extensively more time for the customer to work with the solution, and for Microsoft architects to provide value to the customer on how to use Power BI with Azure.
• In the first week, over two dozen customers requested a POC deployment with little assistance from the heavily limited resource team, allowing for more valuable allocation of resources.
40. You Must Evolve
1. Data WILL come from more than just SQL Server and Azure SQL Databases.
2. You will be expected to do more with less.
3. Embrace automation tools inside the database platform, (backup, recovery, optimization, maintenance tasks.)
Equal to 5,200 GB of data for every human being.
That’s 40 trillion GB.
To hit these numbers, data has doubled every two years since 2012.
Most of this data is created by machines as they talk to each other.
33% of the data that we aren’t currently analyzing is valuable.
MHE= Microsoft Higher Education
How do I know?
This is happening on my own team.
More customers, accelerated development cycles, wanting to do more with less, and fewer technologists, all spread thin across tons of technical areas.
My example.
The higher-ed solution is something we want to use with more universities and colleges. We don’t have the resources to help more customers deploy it.
It takes up to 8 weeks to deploy all the tiers of the environment with our customers.
My skills in BASH far outweigh my skills in PowerShell, so with all the new technology, it makes sense to work in the one of the two options that I’m more familiar with. It upgrades the opportunities for success.
Next phase will be in PowerShell, on the roadmap.
JSON’ing themselves to death with automation scripts.
Almost every SQL DBA has a suite of PowerShell scripts to manage their databases.
Future development of Azure will be BASH first, PowerShell second.
BASH is more robust and should be considered as a second scripting language. Not Perl, OK, maybe YAML and JSON, but BASH will serve at many levels, not just Azure.
Python, over R, will serve the data community as the data libraries evolve and advance for Python. They will be built much faster than R.
After the first demonstration, a main higher-ed university recommended it to their 13 secondary universities to simplify and unify their data platform.