Redgate Database DevOps for SQL Server
Johannesburg | 04 March
Cape Town | 07 March
Join Redgate & Blue Turtle Technologies at our upcoming event ‘Compliant Database DevOps for SQL Server’ and learn how you can deliver software faster while keeping your data safe.
During this lunchtime workshop, you’ll have the opportunity to hear from industry experts as they discuss the challenges of DevOps adoption and share their experiences of how to successfully implement Compliant Database DevOps across teams.
Network with like-minded data professionals and learn the skills you need to ensure your business benefits from a DevOps approach to database development, while minimizing risk.
If you’re a C-level executive, director, manager, senior database professional, architect, or engineer working on the Microsoft Data Platform, this is the event for you!
12:00 Registration and lunch
12:30 Welcome – Blue Turtle introductions and setting the scene
12:45 How to provision production data securely with SQL Provision, Chris Kerswell, Redgate
14:00 Keep your delivery processes secure with Compliant Database DevOps, Chris Kerswell, Redgate
15:00 Q&A, wrap-up and closing remarks
According to IDC, the world will generate 50 times the amount of data it did in 2011, with 75 times the number of information sources.
That data can potentially be of great use as organisations seek ways to drive better efficiencies while improving customer experience and overall performance – across departments and at every stage of the client engagement.
To turn these potential opportunities into action, companies require clear insights, powered by software that makes data analysis fast, easy and useful.
Tableau builds software that does exactly that.
Making data understandable (and useful)
From its days as a startup at Stanford University, the fundamental ingredient for Tableau was the drive to make data understandable to all people. A philosophy built on three key pillars:
- Liberate data: Data analysis should be about asking questions, not learning software. Data must be free to tell stories that can be easily understood by those who need it.
- Empower people: Tableau’s self-service tool helps people feel respected, capable and powerful – motivating employees to drive their organisations forward in a different, insightful, way.
Tableau is built to help democratise data – giving people the ability to think, act and deliver.
- Design for people: Tableau is built from the ground up to put the user first and everything else (frankly) comes second. The company believes helping people to see and understand data is one of the most important missions of the 21st century.
What do other companies say about Tableau?
From Financial, Healthcare and Public sector services to Travel, Retail and Communications-related services – companies from just about every industry are gaining significant insights through Tableau.
Lufthansa, for example, increased efficiency by 30% while gaining flexibility and departmental autonomy through its Tableau engagement.
Although Lufthansa has been a very analytical and data-driven company for a long time, there was no uniform group reporting in 2016.
Heiko Merten, Head of BI Applications in Sales at Lufthansa, remembers: “Each department had its own reporting system and there were no uniform standards. The development of each new analysis had to be formally applied for in the IT department and often took a long time because of the high workload, which resulted in a constantly growing backlog.”
“With Tableau, it is much easier to consolidate different data sources in a fast and interactive format,” explains Christian Novosel, Head of Strategic BI Initiative at Lufthansa. “We can now make data-based decisions live in a meeting. Acceptance reaches all the way to the board level with our CFO, who supports our initiative.”
Liberate your data with Tableau and Blue Turtle
Tableau’s products are transforming the way people use data to solve problems.
The software makes analysing data fast, easy, beautiful and useful. It’s software for anyone and everyone.
Blue Turtle, as a South African technology partner, looks forward to helping local businesses liberate their information (and their teams) to make data-driven decisions built on a ‘single source of truth’.
We believe there is significant scope for South African companies to capitalise more on the efficient use of information and invite you to reach out to our team of specialists to discuss ways in which our partnership with Tableau can benefit your business.
Redgate Customer Experience
Cape Town | 03 July
Johannesburg | 04 July
You are invited to join your industry peers and Blue Turtle, South Africa’s experts in Redgate’s Database DevOps solutions, for an introduction to the benefits of bringing DevOps practices to your database development processes.
This lunch event will provide you with valuable insights about how DevOps, as a change agent, can enable you to deliver value quicker AND keep your data safe.
Meet with other customers, share your experiences and help shape the future of your environment.
12:00 – 12:30 Registration and lunch
12:30 Welcome – Blue Turtle
12:45 – 13:30 Improving IT performance by bringing Secure DevOps to the database, Tom Austin, Redgate
13:30 – 14:00 Break
14:00 – 15:00 Tom Austin, Redgate (continued)
15:00 – 15:30 Q&A
Choose the toolbelt that’s right for you
Free, unsupported tools included in the SQL Toolbelt
DLM Dashboard tracks your database schemas and alerts you when they change.
DLM Dashboard monitors up to 50 of your databases, and sends you an email alert as soon as your databases start to drift, or change from their expected state.
SQL Search is a free add-in for SQL Server Management Studio that lets you quickly search for SQL across your databases.
Drawing a distinction between test automation and continuous testing may seem like an exercise in semantics, but the gap between automating functional tests and executing a continuous testing process is substantial.
Any true change initiative requires the alignment of people, process and technology—with technology being an enabler and not the silver bullet. Yet there are some basic technology themes we must explore as we migrate to a true quality assurance process. In general, we must shift from a sole focus on test automation to automating the process of measuring risk. To begin this journey, we must consider the following:
From Causal Observations to Probabilistic Risk Assessment
With quality assurance (QA) traditionally executing manual or automated tests, the feedback from the testing effort is focused on the event of a test passing or failing—this is not enough. Tests are causal, meaning that tests are constructed to validate a very specific scope of functionality and are evaluated as isolated data points. Although these standalone data points are critical, we must also use them as inputs to an expanded equation for statistically identifying application hot spots.
The SDLC produces a significant amount of data that is rather simple to correlate. Monitoring process patterns can produce very actionable results. For example, a code review should be triggered if an application component experiences all of the following issues in a given continuous integration build:
- Regression failures greater than the average
- Static analysis defect density greater than the average
- Cyclomatic complexity greater than a prescribed threshold
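A trigger like this can be expressed as a simple automated check. The sketch below is illustrative only — the metric names, thresholds and data shapes are assumptions, not taken from any particular CI tool:

```python
from dataclasses import dataclass

@dataclass
class ComponentMetrics:
    regression_failures: float      # failures for this component in the CI build
    defect_density: float           # static-analysis defects per KLOC
    cyclomatic_complexity: float    # highest complexity measured in the component

def needs_code_review(m: ComponentMetrics,
                      avg_failures: float,
                      avg_density: float,
                      complexity_threshold: float = 10.0) -> bool:
    """Trigger a review only when ALL three risk signals fire at once."""
    return (m.regression_failures > avg_failures
            and m.defect_density > avg_density
            and m.cyclomatic_complexity > complexity_threshold)

# A component exceeding every signal is flagged as a hot spot.
hot = ComponentMetrics(regression_failures=7, defect_density=4.2,
                       cyclomatic_complexity=15)
print(needs_code_review(hot, avg_failures=3, avg_density=2.5))  # True
```

Requiring all signals together, rather than reacting to any single one, is what moves the feedback from isolated pass/fail events towards a statistical view of risk.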
From Defect Documentation to Simulated Replay
The ping-pong between testers and developers over the reproducibility of a reported defect has become legendary. It’s harder to return a defect to development than it is to send back an entrée from a world-renowned chef. Given the aggressive goal to accelerate software release cycles, most organisations will save a significant amount of time by just eliminating this back and forth.
By leveraging Service Virtualization for simulating a test environment and/or virtual machine record and playback technologies for observing how a program executed, testers should be able to ship development a very specific test and environment instance in a simple containerised package. This package should isolate a defect by encapsulating it with a test, as well as give developers the framework required to verify the fix.
From Structured Data to Structured and Unstructured
The current tools and infrastructure systems used to manage the SDLC have made significant improvements in the generation and integration of structured data (e.g., how CI engines import and present test results). This data is valuable and must be leveraged much more effectively (as noted above in the “From Causal Observations to Probabilistic Risk Assessment” section).
The wealth of unstructured quality data scattered across both internal and publicly-accessible applications often holds the secrets that make the difference between happy end users and unhappy prospects using a competitor’s product. For example, developers of a mobile application would want constant feedback on trends from end user comments on:
- iTunes app store
- Android app store
- The company’s release announcements
- Competitors’ release announcements
This data is considered unstructured since the critical findings are not presented in a canonical format: parsing and secondary analysis are required to extract the valuable information. Although these inputs might be monitored by product marketers or managers, providing these data points directly to development and testing teams—in terms that practitioners can take action on—is imperative.
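The parsing step can be as simple as scanning free-text reviews for known issue terms. The sample reviews and keyword list below are invented for illustration; a real pipeline would pull live store feeds and use richer text analysis:

```python
import re
from collections import Counter

# Hypothetical sample of raw app-store review text (unstructured input).
REVIEWS = [
    "Crashes every time I open the camera. Please fix!",
    "Love the new dark mode, but sync is slow since the update.",
    "App crashes on login. Uninstalled.",
]

ISSUE_KEYWORDS = {"crash", "crashes", "slow", "freeze", "login", "sync"}

def issue_trends(reviews: list) -> Counter:
    """Tokenise free-text reviews and count mentions of known issue terms,
    turning unstructured feedback into a structured signal for dev teams."""
    counts = Counter()
    for review in reviews:
        for word in re.findall(r"[a-z]+", review.lower()):
            if word in ISSUE_KEYWORDS:
                counts[word] += 1
    return counts

print(issue_trends(REVIEWS).most_common(3))
```

Even this crude secondary analysis turns scattered comments into a ranked list of complaints that testers can act on directly.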
From Dashboards to Business Policies
In a Continuous Everything world, quality gates will enable a release candidate to be promoted through the delivery pipeline. Anything that requires human validation clogs the pipeline. Dashboards require human interpretation—delaying the process.
Dashboards are very convenient for aggregating data, providing historical perspectives on repetitive data, and visualizing information. However, they are too cumbersome for real-time decision making because they do not offer actionable intelligence.
Business policies help organisations evolve from dashboards to automated decision making. By defining and automatically monitoring policies that determine whether the release candidate is satisfying business expectations, quality gates will stop high-risk candidates from reaching the end user. This is key for mitigating the risks inherent in rapid and fully-automated delivery processes such as Continuous Delivery.
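A quality gate built from such policies needs no human interpretation at all. The policy names and thresholds below are illustrative assumptions, not drawn from any specific product:

```python
# Each policy maps a measurable attribute of the release candidate
# to a business threshold. Values here are purely illustrative.
POLICIES = {
    "max_open_critical_defects": 0,
    "min_test_pass_rate": 0.98,
    "max_performance_regression_pct": 5.0,
}

def evaluate_gate(candidate: dict):
    """Return (promote?, violations) with no human in the loop."""
    violations = []
    if candidate["open_critical_defects"] > POLICIES["max_open_critical_defects"]:
        violations.append("open critical defects")
    if candidate["test_pass_rate"] < POLICIES["min_test_pass_rate"]:
        violations.append("test pass rate below threshold")
    if candidate["performance_regression_pct"] > POLICIES["max_performance_regression_pct"]:
        violations.append("performance regression too large")
    return (not violations, violations)

# A candidate with one open critical defect is stopped at the gate,
# with the violated policy reported back to the team.
ok, why = evaluate_gate({"open_critical_defects": 1,
                         "test_pass_rate": 0.99,
                         "performance_regression_pct": 2.0})
```

Because the decision is a policy evaluation rather than a dashboard reading, it can run on every build without delaying the pipeline.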
From Tool Dependent to SDLC Sensors
Let’s face it—it’s cheap to run tools. And with the availability of process intelligence engines, the more data observations we can collect across the SDLC, the more opportunities will emerge to discover defect prevention patterns.
Given the benefit of a large and diverse tool set, we need to shift focus from depending on a single “suite” of tools from a specific vendor (with a specific set of strengths and weaknesses) to having a broad array of SDLC sensors scattered across the software development life cycle. And to optimise both the accuracy and value of these sensors, it’s critical to stop allowing tools to be applied in the ad hoc manner that is still extremely common today. Rather, we need to ensure they are applied consistently and that their observations are funneled into a process intelligence engine, where they can be correlated with other observations across tools, across test runs and over time. This will not only increase the likelihood of identifying application hot spots, but will also decrease the risk of false negatives.
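The correlation idea can be sketched in a few lines: group findings by component and treat agreement between independent sensors as the hot-spot signal. The sensor names and observation format below are assumptions made for the example:

```python
from collections import defaultdict

# Illustrative observations from independent SDLC "sensors"
# (static analysis, regression tests, coverage tooling).
OBSERVATIONS = [
    {"component": "billing", "sensor": "static_analysis", "finding": "null-deref"},
    {"component": "billing", "sensor": "regression_tests", "finding": "3 new failures"},
    {"component": "billing", "sensor": "coverage", "finding": "coverage dropped 12%"},
    {"component": "search",  "sensor": "static_analysis", "finding": "unused import"},
]

def correlate(observations, min_sensors: int = 2):
    """Group findings by component; a component reported by several
    independent sensors is a likelier hot spot than any single signal."""
    by_component = defaultdict(set)
    for obs in observations:
        by_component[obs["component"]].add(obs["sensor"])
    return [c for c, sensors in by_component.items() if len(sensors) >= min_sensors]

print(correlate(OBSERVATIONS))  # only "billing" is flagged
```

Requiring agreement across sensors is also what reduces false negatives: a defect pattern missed by one tool can still surface through another.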
New Continuous Testing Book: Continuous Testing for IT Leaders
You can learn more about evolving from automated testing to Continuous Testing. Read the new 70-page book Continuous Testing for IT Leaders to learn how Continuous Testing can help your organisation answer the question “Does the release candidate have an acceptable level of business risk?”
You’ll learn how to:
- Collaboratively close the gap between business expectations and DevTest activities
- Establish a safety net that helps you bring new features to market faster
- Make better trade-off decisions to optimise the business value of a release candidate
- Establish feedback loops that promote incremental and continuous process improvement
This book provides a business perspective on how to accelerate the SDLC and release with confidence. It is written for senior development managers and business executives who need to achieve the optimal balance between speed and quality.
TrapX Security™, a global leader in advanced cyber security defense, today announced that Blue Turtle, leaders in solutions for optimization and management of IT systems, has chosen TrapX DeceptionGrid™ to expand its security service portfolio. The partnership helps solidify TrapX’s growth in South Africa by providing Blue Turtle’s customers with the industry’s leading deception-based technology.
DeceptionGrid automates the deployment of a network of camouflaged malware traps that are intermingled with real information technology resources. If malware touches DeceptionGrid just once, it sets off a high-confidence alert. Real-time automation isolates the malware and delivers a comprehensive assessment directly to an organization’s security operations team.
“We offer a comprehensive network security portfolio of products to our customers,” said Martyn Healy, Marketing Director at Blue Turtle. “But, as we’ve seen in recent months, there’s always going to be some element of risk even to organizations that have bottomless pockets and spend untold millions attempting to build a fortified network perimeter. We are excited to partner with TrapX, as we believe that TrapX DeceptionGrid offers an extra layer of protection and mitigation that’s been proven reliable in case an attacker does manage to penetrate our perimeter defenses.”
“The fact is there is no one foolproof way to protect an organization’s data against aggressive attackers and crime syndicates. It’s no longer a question of ‘if’ a large organization has been penetrated, but ‘when’,” said Carl Wright, General Manager of TrapX Security. “What’s important is a layered security approach that includes a fully-featured firewall, endpoint and deception protection. A properly configured network security stack with DeceptionGrid substantially reduces the time to breach detection and practically eliminates false positive alerts, which is one of the biggest complaints coming out of IT departments today. We are pleased to be offering Blue Turtle’s customers peace of mind that our software will help protect them from the latest malware and advanced persistent threats.”
Myth Busters! Have you ever received conflicting information about DCIM? How about wrong information? Data center infrastructure management (DCIM) is a complex topic, so it’s no wonder that misperceptions and spin-doctoring are common.
Register for Nlyte Software’s educational webinar to find out more on DCIM myths, then discuss the pertinent facts and steps to take when faced with a DCIM myth.
Webinar Registration : Common Myths About DCIM – Debunked
Choose a session on Wednesday, April 29:
Session 1 at 15:00 GMT / 10am ET
Session 2 at 2pm ET / 1pm CT / 11am PT
Please select the courses below for which you would like to register, as well as the number of students you will be registering. A 15% discount is offered on the third delegate when a single customer registers three delegates for a single course.
Please note that invoices will be emailed to the contact email address provided; proof of payment will secure your registration.
If you are experiencing issues submitting this page, or require assistance, please email firstname.lastname@example.org.
Blue Turtle partners with Quorum to deliver resilient, reliable, cost-effective onQ On-Site Disaster Recovery services to protect data and applications for enterprises of all sizes.
Blue Turtle, a leading South African technology management company, today announced its partnership with Quorum, the number one provider of one-click backup and disaster recovery (DR) for small to mid-sized business.
Blue Turtle has expanded its infrastructure product portfolio with Quorum’s award-winning hypervisor-based DR solution, onQ On-Site. onQ On-Site is the building block of Quorum’s unique high-availability (HA), DR, and disaster recovery as a service (DRaaS) capabilities. Organisations may choose DRaaS, which protects on-premises applications to the cloud, or in-cloud DR, which protects applications deployed in the cloud and replicates them to a secondary cloud data centre.
Quorum’s onQ On-Site effectively maintains up-to-date virtual machine clones of critical systems, and transparently takes over failed servers within minutes. Leveraging Quorum technologies, Blue Turtle now offers a simple, fast and cost-effective platform to protect and recover critical IT services for users; whether service interruption is caused by man-made or natural disaster, equipment failure or data loss.
“Disaster recovery is often considered a first step for small to medium enterprises transitioning to a fully outsourced IT model, consisting of co-location, managed hosting, and cloud. The Quorum platform allows organisations to test the viability of this strategy, while meeting critical business continuity objectives,” says Avash Maharaj, Infrastructure Business Manager for Blue Turtle Technologies. “With Quorum’s onQ On-Site offering, enterprises can eliminate months’ worth of cloud compute costs, by maintaining a disaster recovery strategy.”
With Quorum, downtime events are essentially eliminated, ensuring instant asset restoration, so that daily activities remain uninterrupted and business continuity prevails. “Easy to use, cost-effective and PCI-compliant, the Quorum onQ On-Site solution is the best defence against costly downtime events, which are particularly vexing for mid-market government and educational organisations,” said Walter Angerer, CEO at Quorum. “With its Quorum-powered onQ HA solution, customers can rest assured that they have the very best in disaster recovery protecting their critical assets.”
Worldwide, disaster recovery and business continuity arrangements look bleak, and South African (SA) enterprises echo these trends. According to an EMC-sponsored survey: “The top three causes of data loss and system downtime in SA are loss of power (56%), hardware failure (51%) and software failure (50%)”1. This hurts businesses through lost employee productivity and lost revenue. The ability to instantly recover critical data, systems, and applications after a failure or disaster is crucial to the continuity of business operations.
David Fisk, EMEA Sales Director at Quorum, reiterated: “Many organisations today need a highly-available disaster recovery solution to protect critical applications and data, but without the traditional price tag or complications. As a result, Quorum’s highly regarded technology is plugging a gap in the market with its simple to manage and cost-effective platform. In fact, Network Products Guide 2013 Best Products Awards announced Quorum as a finalist for the ‘Best Cloud Storage and Backup Solution’, as this is the only solution in the industry that delivers assured one-click recovery in minutes.”
1 EMC Disaster Recovery Survey 2013: South Africa