Training: October 4th
0800 Check-In | 0900 – 1700
Securing SQL Server Workshop
SQL Server is one of the most widely used database platforms in the world, and a large percentage of these databases are not properly secured, exposing sensitive customer and business data to attack. This one-day seminar is essential for DBAs and developers who need to produce secure database applications and manage secure databases. In addition to demonstrating the latest security technologies, such as Always Encrypted, Dynamic Data Masking, and Row-Level Security, this seminar gives practical advice and engaging examples on how to defend your data (and ultimately your job!) against attack and compromise. Perhaps just as significantly, you will learn about current, real examples that illustrate the potential consequences of not following these best practices. You will leave the course armed with the skills required to recognize actual and potential database vulnerabilities, implement defenses for those vulnerabilities, and test those defenses for sufficiency. This seminar follows the approach of showing you what is at risk, how to secure it, and how to audit access, so that you can be confident you have secured SQL Server effectively.
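To make one of these technologies concrete, here is a minimal Python sketch of the idea behind Dynamic Data Masking: non-privileged readers see obscured values while the stored data is unchanged. The function names, row shape, and masking rule below are illustrative assumptions, not SQL Server's actual API (in SQL Server, masking is declared on the column and enforced by the engine).

```python
def mask_email(value: str) -> str:
    """Mimic an email-style masking rule: keep the first letter,
    obscure the rest of the address. (Illustrative only.)"""
    local, _, _ = value.partition("@")
    return f"{local[:1]}XXX@XXXX.com"

def read_customer(row: dict, privileged: bool) -> dict:
    # Non-privileged readers get masked data; the stored row is untouched.
    if privileged:
        return row
    return {**row, "email": mask_email(row["email"])}

row = {"name": "Ada", "email": "ada@example.com"}
print(read_customer(row, privileged=False))  # email shown as aXXX@XXXX.com
```

The key point the seminar makes with the real feature is the same as here: masking happens at read time, per caller, so the application never has to scrub data itself.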
This workshop applies VD/ERM in practice, employing Decision Analysis mathematical modeling in conjunction with the Decision Quality valuation process as a viable model for risk management.
Stanford University’s Value-Driven Enterprise Risk Management (VD/ERM) is traditionally positioned as a decision quality process used for long-term (5-20 year), multi-billion-dollar projects where many variables are highly volatile or unpredictable: the types of decisions a company would need to make for oil exploration, pharmaceutical research, defense weaponry, and even technology strategy roadmaps.
But what if we could repurpose VD/ERM for information security decisions (similar in that only limited, volatile, and sometimes unpredictable information is available) and, through today’s computing power, create better-quality decisions for extremely short event horizons?
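As a flavor of what Decision Analysis modeling looks like at this scale, here is a minimal Monte Carlo sketch comparing two security alternatives under uncertainty. Every number (probability ranges, loss ranges, control cost) is invented for illustration, and uniform ranges are a deliberately crude stand-in for the probability distributions a real Decision Analysis would elicit.

```python
import random

def expected_annual_loss(p_breach: tuple, loss: tuple, trials: int = 100_000) -> float:
    """Monte Carlo estimate of expected annual loss, with p_breach and
    loss given as (low, high) uniform uncertainty ranges."""
    random.seed(42)  # reproducible for the example
    total = 0.0
    for _ in range(trials):
        p = random.uniform(*p_breach)      # sample a breach probability
        cost = random.uniform(*loss)       # sample a loss-if-breached
        if random.random() < p:
            total += cost
    return total / trials

# Alternative A: do nothing. Alternative B: spend $200k on controls
# that (we assume) cut the breach-probability range in half.
baseline = expected_annual_loss(p_breach=(0.10, 0.30), loss=(500_000, 2_000_000))
mitigated = expected_annual_loss(p_breach=(0.05, 0.15), loss=(500_000, 2_000_000)) + 200_000
print(f"baseline expected annual loss:      ${baseline:,.0f}")
print(f"mitigated loss + control spend:     ${mitigated:,.0f}")
```

The decision-quality question is then whether the reduction in expected loss justifies the control spend; today's computing power makes running thousands of such comparisons, over very short event horizons, essentially free.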
Artifact-Based Risk Management
Many of today’s risk management evaluation methods fall into one of three major categories:
- Asset / Actor Identification
- Vulnerability / Threat Tree
- Scenario / Situational Threat Matrix
These methods all rely on starting with known quantities, applying assumptions, and deducing a set of artifacts (people, places, events) where the enterprise is most vulnerable. They all follow a Critical Path Management (CPM) protocol and are well suited to areas of limited scope, as is the case in many physical security operations.
CPM falls short when the scope of protection includes both known and unknown areas (i.e., too many paths of discourse), as with many internet-based operations. The input factors to risk management are tried and true; it is the perception and use of those inputs that determines a novel output matrix, one that can help mitigate the unknown actor, threat, or situation.
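As a concrete (if toy) instance of the third category, a scenario/situational threat matrix can be sketched as a likelihood-times-impact grid over asset/threat pairs. Every asset, threat, and score below is invented for illustration; real programs elicit these from assessments, not a hard-coded table.

```python
# Likelihood and impact scored 1-5 per (asset, threat) pair.
likelihood = {
    ("customer_db", "sql_injection"): 4, ("customer_db", "phishing"): 3, ("customer_db", "tailgating"): 1,
    ("web_frontend", "sql_injection"): 5, ("web_frontend", "phishing"): 2, ("web_frontend", "tailgating"): 1,
    ("badge_system", "sql_injection"): 1, ("badge_system", "phishing"): 2, ("badge_system", "tailgating"): 4,
}
impact = {
    ("customer_db", "sql_injection"): 5, ("customer_db", "phishing"): 4, ("customer_db", "tailgating"): 2,
    ("web_frontend", "sql_injection"): 4, ("web_frontend", "phishing"): 3, ("web_frontend", "tailgating"): 1,
    ("badge_system", "sql_injection"): 2, ("badge_system", "phishing"): 2, ("badge_system", "tailgating"): 5,
}

def risk_matrix() -> dict:
    """Cell score = likelihood x impact (so 1-25)."""
    return {pair: likelihood[pair] * impact[pair] for pair in likelihood}

# Rank cells so mitigation effort follows the highest scores first --
# in effect, the critical path through the matrix.
ranked = sorted(risk_matrix().items(), key=lambda kv: kv[1], reverse=True)
for (asset, threat), score in ranked[:3]:
    print(f"{asset:13s} {threat:14s} risk={score}")
```

Note what the sketch makes visible: every cell starts from a *known* asset and a *known* threat, which is exactly the limitation the paragraph above describes; an unknown actor or scenario simply has no row or column to land in.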
Implementing the NIST Cybersecurity Framework Workshop
In 2013, US President Obama issued Executive Order (EO) 13636, Improving Critical Infrastructure Cybersecurity, which called for the development of a voluntary, risk-based cybersecurity framework (CSF) that is “prioritized, flexible, repeatable, performance-based, and cost-effective.” The CSF was developed through an international partnership of small and large organizations, including owners and operators of the nation’s critical infrastructure, under the leadership of the National Institute of Standards and Technology (NIST). In this session we will discover how the framework works, how to implement it, and what changes are proposed as the framework moves to version 1.1.
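A common first implementation step is a current-profile self-assessment against the framework's five core Functions (Identify, Protect, Detect, Respond, Recover). Here is a minimal Python sketch of that idea; the Function names come from the framework itself, but the sampled categories and the 0-4 maturity scores are invented for illustration.

```python
# Toy current-profile: a few categories per Function, scored 0-4.
csf_scores = {
    "Identify": {"Asset Management": 3, "Risk Assessment": 2},
    "Protect":  {"Access Control": 3, "Data Security": 1},
    "Detect":   {"Anomalies and Events": 2},
    "Respond":  {"Response Planning": 1},
    "Recover":  {"Recovery Planning": 0},
}

def function_profile(scores: dict) -> dict:
    """Average category maturity per Function: a crude current-profile
    snapshot to compare against a target profile."""
    return {fn: sum(cats.values()) / len(cats) for fn, cats in scores.items()}

for fn, avg in function_profile(csf_scores).items():
    print(f"{fn:9s} {avg:.1f}")
```

The gap between a profile like this and the organization's target profile is what drives the framework's prioritized, cost-effective action plan.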