With Halloween just around the corner, I was recently explaining to my young children that while decked out in their costumes, they will need to politely say, “Trick or Treat” if they would like to receive some candy. My 6-year-old son asked, “Why would I want a trick? I want treats!”
Isn’t that what we all want? Treats, not tricks!
Unfortunately, there are some ghosts and goblins running around who are taking liberties with the term “cloud”. It appears their overall objective is to trick the end-user community into thinking they have a cloud-based solution when in fact they do not. The term “cloud washing” spans numerous industries today, but it’s being thrown around especially freely in the physical security channel. Cloud washing is the purposeful and sometimes deceptive attempt by a vendor to rebrand an old product or service by attaching the buzzword “cloud” to it.
We saw a similar trend take place with the term “green”. Companies were falling all over themselves to associate the term with their products, but in most cases the association amounted to a misleading claim about the environmental benefits of a product, service, technology or company practice. In both cases, just saying you are something doesn’t magically make it true.
I feel it is imperative that we continue to educate our industry on what the real cloud is. The National Institute of Standards and Technology (NIST) released a document in September 2011 that defines cloud computing. The NIST Definition of Cloud Computing clearly states that the cloud is composed of five essential characteristics:
- On-demand self-service
- Broad network access
- Resource pooling
- Rapid elasticity
- Measured service
Taken together, these characteristics mean that you can provision services yourself to serve large populations efficiently; that you pay only for exactly what you need, when you need it; and that you may share a common infrastructure with all other customers by logging into the same system everyone else uses.
When I started at Brivo almost 10 years ago, the term “cloud” did not yet exist. In doing a bit of research online, it appears the term started to gain momentum around 2006. Now you can hardly make it past page one in most security publications without finding the term in the title of an article, in an advertisement, or as part of a story. In some cases, the advertiser is stretching the truth quite a bit, and in fact may be misapplying the term altogether. Make sure you do your research and don’t just take their word for it. Who wants to be tricked when you really want to be treated?
-Dave Williams, Senior Director of Strategic Accounts
By now, the whole world knows that Amazon founder Jeff Bezos has purchased the struggling Washington Post for $250M. Not everybody agrees on why he made this purchase, though. Several theories are emerging; foremost among them is that Mr. Bezos will use his expertise in the high-tech world to push the newspaper into the drastic changes it should have made much earlier. I’d like to argue that Mr. Bezos actually bought this newspaper to further propel his online business. At the heart of any acquisition strategy is whether the purchase brings in additional profit or helps some other (often existing) business. In this case, we all agree that the Post is losing money (hence the sale), but could it be profitable for Amazon? Possibly. Here are some potential reasons, any or all of which may be the actual ones that prompted Mr. Bezos’ purchase:
1) Convert print subscribers to digital and drive more Kindle sales
This is easier said than done; otherwise the Post would have done it already. Amazon is different, though, and Mr. Bezos more different still, notably in that he has no urgency to turn a profit. With approximately one million subscribers (including Sunday), there are a million potential customers waiting to be sold a Kindle. Of course, he doesn’t need to charge anything for the hardware (keeping the spirit of low or negative margins in the short term for longer-term profits) but can simply lock them into a 12- or 24-month contract at the same price they are already paying. Say 20% convert. A lot of them will be hooked, not only on the Kindle (or e-reader) but also on the digital platform…and guess where they start going first for online shopping?
2) Spread into the content provider landscape from purely eCommerce
Everybody agrees that in buying the Post, Mr. Bezos got the content. He also got the reporters, publishers, editors, and others who create quality content. What that means for Amazon is that it can suddenly start competing in the content landscape of the internet. Mr. Bezos may not be thinking of creating a search portal immediately (though you can never be sure), but at least he can threaten the existence of several struggling content providers and easily establish Amazon in a top-three spot in an area where it presently doesn’t even compete. A brand new customer base using the same content!
3) Leverage the Post’s distribution network
Mr. Bezos not only acquired the content of the Post but also its wide distribution network. For every household that subscribes, there are another ten in the neighborhood that do not. That means roughly ten million households are easily accessible to the Post’s network. What do you do with that? Bring everything that Amazon sells by eCommerce into a catalog. Whether the subscribers use the print catalog or their neighbors who have already migrated to digital use the internet, they are still buying from Amazon. Amazon could take the network even further (if it wanted to!) and use the same distribution to deliver lightweight merchandise. Amazon has been trying to create delivery methods as a differentiator; now it does not need to purchase one separately or rely on a third party.
4) Sell the Post for a profit
If all or some of the above (or something else) works for the Post and it sees a turnaround, there could be renewed interest in other print media. That would mean an increase in the valuation of the Post (and this is separate from the actual difference Mr. Bezos can make; this is just the industry multiple that Wall Street places on such businesses). All this would mean that a few years down the road, Mr. Bezos could simply turn around and sell the paper for a profit!
So, if you are thinking that Jeff Bezos is some rich guy looking to satisfy his ego by fixing a problem that had not been fixed by so many other smart people for a number of years, think again. If my assessment about Bezos is right, he will transform a currently struggling company and eventually help transform the industry. This represents one more way that the innovations in the cloud industry are transforming businesses all around us, every day.
What will Brivo’s cloud-based security expertise do to transform other industries? Stay tuned…
-Rajeev Dubey, Sr. Director of Marketing
What’s it like to be part of an engineering team in a high-tech product company? Brivo offered students from nearby Bethesda-Chevy Chase High School a chance to find out by hosting an internship program during the 2012/2013 school year. We were interested in aiding and mentoring students in the process of making important decisions about college and their future career paths. What better way than to offer real-life work experience and a taste of the engineering world?
We were originally interested in starting such a program because of our prior recruitment challenges. When we tried to enlist top talent — especially at the college level — we often lost out to better-known companies. By advertising ourselves early on, we hoped to gain recognition among other employers and connect with potential top talent.
From the beginning, we handled our program differently than other internships. To start, we asked to host a team instead of an individual student, to mirror group collaboration in the workforce. After sharing our approach with the school’s internship coordinator, we built a group of four students to form our Brivo Engineering Intern Team. The team spent the equivalent of one to two class periods per weekday at Brivo.
We initially arranged one extended assignment for the interns; however, the plan morphed into a series of projects that provided our interns with these invaluable lessons for the future:
- “Big picture” thinking: We asked them to always think about the “big picture” or the context before beginning an assignment. We encouraged them to ask “why” when they didn’t understand something and to consider the implications of their approach to a project.
- “Wax on, wax off”: The students didn’t get this reference to the (now old) movie The Karate Kid, but we tried to instill the value of learning and improving fundamental skills. Coming in, the interns expected to become experienced with computer programming. Before starting a software development project, we asked them to learn about Agile development methodologies. They not only explained and presented what they learned, but used it as part of their work process.
- How a product company works: Even though the interns were officially part of the engineering department, we wanted to give them a total overview of a product company. The interns learned about sales, marketing, operations, administration and other departments through Q&A sessions and short assignments.
- Trust and commitment: It’s important to learn that trust is earned, and depends on one’s ability to honor commitments and deadlines. The interns learned that if they did not take their work seriously, it could negatively affect their grades. In the end, however, they were able to bounce back and experienced the trust that accompanies completing important tasks.
- Teamwork: Since the success or failure of an assignment impacts everyone, the interns learned to work together as a team. They learned the value of giving and receiving feedback from their teammates.
- Agility in practice: As the program progressed, we had to make adjustments while keeping the overall objectives intact. Thus, the interns learned to evaluate trade-offs and to prioritize tasks.
At the end of the program, we held a debrief session with our interns to obtain feedback about the year. Participants widely considered it a positive and educational experience. One intern expressed interest in doing an independent summer project with mentors from Brivo. Another referred her sibling to a Brivo summer internship. The school appreciated the experience we provided, and we've already committed to four new interns for this upcoming school year. We believe our inaugural program was successful not only because we increased our recognition as a technology company in the local community, but more importantly because we invested in four talented young people who someday may be able to link their success to an internship at Brivo.
-Rudy Setiawan, Senior Director of Engineering
As a cloud company, it hurts me to say it…but some organizations exist that have little need for cloud computing or Software as a Service (SaaS).
Most of the time, they are hard to spot. On the outside, they look like normal companies—working to beat their competitors and grow their business. They seem to have the same resource constraints and budget pressures as the rest of us. However, once you explain the benefits of cloud computing and SaaS, they reveal to you a magical array of computing power and internal resources that they can brandish on the simplest of corporate problems.
Once I hear about all the investments they’ve made and the awesome capabilities they have, I must concede, “The cloud is not for you.”
As a public service to all cloud and SaaS salespeople of the world, I’m providing this simple guide to pinpointing organizations where the cloud is redundant. Here’s a handy checklist to determine if your latest prospect falls into this “cloud-free zone”.
The cloud is not for those who…
• Always do their back-ups
• Keep their software patches up to date
• Have a solid disaster recovery plan
• Have multiple data centers for redundancy
• Pay nothing for their internal computing resources
• Have an internal Service-Level Agreement for 99.999% uptime or better
• Have no problems getting devices and users on-line
• Have data centers with multiple paths to the internet
• Have an ongoing vulnerability and penetration testing program
• Have quick and free access to as many IT professionals as needed
• Always have the budget for annual software support and upgrades
• Monitor system status and penetration attempts around the clock
• Can afford to purchase excess capacity so that it will be available for later expansion
For the lucky few who possess these vast assets, I salute you and declare that the cloud is not for you. For the rest of us who face any or all of the challenges listed above, the cloud is definitely for you.
- John Szczygiel
Today, the designing, debating and color selections are over. After several months of planning, we’re moving into our new headquarters, one that reflects our roots and future.
So, what makes this new space special? Most notably, we’ve adopted an open seating plan to increase communication and foster our culture of innovation and customer service. We’ve christened part of the office the “Brivo University Training Center,” an area with a 25-person capacity and on-site and distance learning technologies. This area will give our dealers the tools they need to deepen their product knowledge and further grow their business. Other features include an experience center, demonstration rooms, technology testing labs, multiple conference rooms, and a snappy, well-stocked café for employees to socialize and recharge. Our new, high-technology HQ will help us retain and attract top talent who might otherwise look for opportunities outside of the Washington, DC area or flock to the local consulting or government contracting world. We are at the forefront of creating Silicon Valley Metro DC.
As a SaaS company, Brivo is devoted to protecting the environment by eliminating the need for local servers. In our new headquarters, we’re expanding this commitment by working in a LEED-certified building, encouraging environmental awareness and energy efficiency. Aside from maximizing energy savings, the lighting control system is integrated with other office technologies so we can demonstrate to our dealers and technology partners what their customers can do in their office spaces.
Our design is decidedly focused on using technology to share information. The entire HQ is wired with Apple TVs, giving us a crisp, collaborative way to display information and metrics for meetings, trainings, and demonstrations by simply “AirPlaying” such data. These innovations will undoubtedly drive our commitment to customer service.
So today, we move into our new home at 7700 Old Georgetown Road. We stand in the heart of Bethesda, MD, closer to downtown shopping and dining, closer to metro rail and bus access, and still a stone’s throw from the nation’s capital. Our flagship office marks a new era for the company that pioneered the use of the cloud for access control and video surveillance long before anyone knew there were clouds beyond cumulus. We’re delighted to offer an imaginative, attractive space for our growing team, one where we can welcome our dealers, partners and guests.
The Brivo Team
P.S. To learn more about our move and to see more pictures, visit www.brivo.com/onthemove.
After more than 10 years of talking to customers about Software as a Service and cloud computing, I’ve heard a lot of interesting reactions. Most of them come about because of the collective failure of vendors to adequately (or honestly) explain what the cloud is all about.
Today, I’m going to give you my collection of the top three weirdest ones I’ve heard and the meaning I interpret from these misapprehensions.
“I like everything about the cloud except the computers aren’t in my office.”
This is like saying, “I like everything about ice cream except that it’s cold.” After a few questions, I determined that this customer simply liked browsers. That’s all. Just browsers. As for everything behind the browser—cloud data centers, dedicated IT staff, disaster tolerance, remarkable uptime—none of that mattered. All that counted was just pointing and clicking in a familiar user interface.
Of course, it’s everything behind the browser that makes the cloud such a powerful solution—massive scale, automatic updates, built-in redundancy, lower costs, multi-tenancy…to name just a few. The confusion here isn’t surprising, given how some vendors disingenuously interchange the terms Web-based (anything that connects to an IP network) and Web-hosted (applications that actually run in a data center).
“I’m just going to build my own cloud.”
This is the same as a gas station owner saying, “I’m going to refine my own diesel.” It ignores all the investment and massive scale required to produce fuel economically, not to mention the glaring question of refining expertise. Yet this is exactly what many who are new to the cloud game are attempting to do. Somehow they never got the memo that the cloud only works when it operates on a very large scale. There’s no such thing as a one-computer cloud, regardless of what the purveyors of “private cloud” appliances will tell you.
“How do they decide who gets to sell pieces of the cloud?”
This is a case of taking the cloud metaphor way, way too literally, as if there were just one actual cloud and we all had to share it. Is advertising the culprit, or is it just the cloud metaphor itself? Admittedly, it’s difficult to communicate a complex and abstract subject like “the cloud” in just a few words or a single image. The picture the media paints is of an amorphous computational gas out there that vendors have harnessed for your needs and then charge you for the pleasure of using.
We all know the cloud is really just a bunch of computers hooked together. But that raises the question, “What makes it so special?” The answer is that all these computers can be shared, dynamically, with much greater efficiency, while keeping customer data safe with multi-tenant software architectures. I’m not sure this explanation helps the layman all that much, but I’ll just keep on trying.
The cloud will eventually become like the air we breathe—something we don’t even really notice, yet depend on vitally. Will everyone finally understand the cloud at that point, or will it just magically vanish from discussion?
When was your last serious discussion about the mysteries of running water and sewage?
-Steve Van Till
Fans of the Hitchhiker’s Guide to the Galaxy will recall that the “Answer to the Ultimate Question of Life, the Universe, and Everything” is 42. That’s it. No context. No meaning. 42.
As the Big Data enterprise begins to produce more “answers,” it’s important that we not accept the sort of context-free results we recall from the Hitchhiker’s Guide. Instead, we need to compare one set of results to another to see where we stand, whether it’s loss prevention, incident rates, video analytics, or any other dimension of security. Below, we’ll explore both the sources and uses of normative data in Big Security Data.
In its simplest form, normative data consists of statistical samples of large data sets that provide answers to questions like:
- How many times does X happen per week at a commercial office building?
- How many times does X happen in retail stores vs. commercial property?
- What percentage of employees or visitors exhibit behavior Y nationwide?
- Has the percentage changed since last year?
- Are incident rates worse at certain kinds of properties?
- Are my incident rates worse than national or regional norms?
- Is there anything out of the ordinary in this week’s data?
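To make the last question concrete, here is a minimal sketch of how a normative comparison might work, using entirely hypothetical numbers: given a sample of weekly incident counts from comparable buildings, a z-score tells us whether this week's count at our own building is out of the ordinary.

```python
import statistics

# Hypothetical weekly incident counts from comparable commercial
# office buildings (our "normative" sample).
norm_sample = [4, 7, 5, 6, 3, 8, 5, 6, 4, 7, 5, 6]

# This week's count at our own building (also hypothetical).
our_count = 11

mean = statistics.mean(norm_sample)
stdev = statistics.stdev(norm_sample)

# The z-score: how many standard deviations we sit from the norm.
z = (our_count - mean) / stdev

if z > 2:
    print(f"Out of the ordinary: {our_count} incidents vs. a norm of ~{mean:.1f}")
else:
    print(f"Within normal range: {our_count} incidents vs. a norm of ~{mean:.1f}")
```

With these made-up numbers the norm is about 5.5 incidents per week, so 11 incidents lands nearly four standard deviations out and gets flagged. The same arithmetic scales to millions of records; only the data gathering is hard.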
If Big Data could provide this sort of information, wouldn’t that be a huge improvement in the way we practice security?
Earlier in my career, I worked in the healthcare informatics business. Unlike the security industry, healthcare is rich in normative data sources. These data are collected by doctors, hospitals, and public agencies; reported to states and quality boards; and analyzed extensively by for-profit companies trying to give their clients an edge.
For almost any situation, a consumer, provider, or insurance company can compare performance and cost against known averages that are sliced and diced in myriad ways. This allows all stakeholders to have more productive conversations about the “facts on the ground” and how they compare to current best practices, historical performance, comparable stakeholders, or any other measure deemed relevant.
Other real-market examples of norms that help improve overall industry performance include: airline on-time performance statistics; automobile quality ratings; manufacturing defect rates; consumer product safety ratings; financial services performance; and the list goes on.
Unfortunately, data in the security industry is largely fragmented and not available for analysis outside of a single enterprise. This makes any attempt at standardized norms or comparative evaluation a rather parochial exercise. Such compartmentalization of data is largely a byproduct of the stovepipe system architectures that dominate our software vendors, and the absence of any regulatory reporting requirement to draw the data out.
Cloud computing is beginning to surmount the challenge of stovepipes, now that SaaS vendors are able to anonymously aggregate data for the benefit of their entire customer base. Look in the fine print of almost any SaaS agreement and you’ll likely find one or more terms indicating your consent to anonymous data aggregation. And that’s the starting point for deriving valuable information for the industry as a whole.
No one vendor will ever hold all the data, but that doesn’t mean individual SaaS services can’t still provide enormous benefit through Big Data offerings. In the healthcare industry, it was often “valuable enough” for a hospital to compare itself to just a subset of other hospitals. That’s because a random sample of part of a group tends to exhibit the same statistical properties as the whole group, or comes close enough, in many cases, to support performance improvement.
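That sampling claim is easy to demonstrate. The sketch below uses simulated, hypothetical data: it draws a 5% random sample from a synthetic "population" of facility event counts and shows that the sample mean lands very close to the population mean.

```python
import random
import statistics

random.seed(7)  # fixed seed so the sketch is reproducible

# Hypothetical "whole group": daily door-event counts for 5,000 facilities,
# drawn from a bell curve centered at 200 events per day.
population = [random.gauss(200, 40) for _ in range(5000)]

# A random subset of just 5% of the facilities...
sample = random.sample(population, 250)

# ...exhibits nearly the same statistics as the full population.
pop_mean = statistics.mean(population)
sample_mean = statistics.mean(sample)

print(f"population mean: {pop_mean:.1f}")
print(f"sample mean:     {sample_mean:.1f}")
```

The two means typically differ by only a few percent, which is why a vendor holding a representative slice of an industry's data can still produce useful norms for that industry.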
And the answer you’ve been waiting for? That answer is actually 48, not 42. At least that’s what 3 million of our anonymous users tell us about how often the typical door is used each day.
-Steve Van Till
Assuming your organization has formulated a strategy and set goals for what you want to achieve with Big Data, there are several paths toward implementation.
To my knowledge, there are no on-premises physical security systems with a Big Data solution built into their core deployments. That’s because Big Data is a relatively new technology, and no one has yet included it in their feature set. It’s also because the technology platforms used for Big Data and those used for security are very different. Finally, few security systems have been set up to marshal all the necessary data into one place, where it can be usefully analyzed with Big Data techniques. In any case, there are three broad approaches worth considering for your Big Data project.
One way to evaluate options is to consider where they fall on the build vs. buy continuum. If your company has deep IT resources, perhaps with other Big Data projects already in place, it may be more attractive to graft your security data strategy onto the tools and talents already at work in other departments. If you have few IT resources and always look to outside vendors for solutions, you will approach this more as a “shopping” exercise than a build-out.
The “roll your own” option begins with exporting data from your current systems and aggregating it onto a Big Data platform where you can perform subsequent analysis. This is known as ETL, or Extract, Transform, and Load. You’ll need to do this because the typical security database platform will not support Big Data operations.
After ETL, the difficult task of programming one of the many Big Data technologies to perform your particular analysis will begin. For this, you’ll want to have access to someone called a “data scientist” in addition to your software developers.
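As a rough illustration of the ETL step, here is what a toy pass might look like in Python. Every file name and field name here is hypothetical; a real export from a security system would have its own schema, and a real pipeline would use a dedicated ETL framework rather than hand-rolled scripts.

```python
import csv
import json
from datetime import datetime

# Create a tiny hypothetical CSV export so the sketch runs end to end.
with open("access_log.csv", "w") as f:
    f.write("site_id,door_name,timestamp,event_type\n"
            "HQ,Front,10/31/2013 09:15,GRANTED\n"
            "HQ,Lab,10/31/2013 09:17,DENIED\n")

def extract(path):
    """Extract: read raw access-control events from a CSV export."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Transform: normalize timestamps and keep only the fields we need."""
    for row in rows:
        yield {
            "site": row["site_id"],
            "door": row["door_name"],
            "ts": datetime.strptime(row["timestamp"], "%m/%d/%Y %H:%M").isoformat(),
            "event": row["event_type"].lower(),
        }

def load(records, path):
    """Load: write newline-delimited JSON, a common Big Data ingest format."""
    with open(path, "w") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")

load(transform(extract("access_log.csv")), "events.ndjson")
```

The pattern is the important part: pull records out of the source system, normalize them into one consistent shape, and land them somewhere a Big Data platform can read.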
If building IT solutions from the ground up isn’t for you, a second option is to transfer your data to an online Big Data solutions provider and work with them to extract the value you seek. Along with many industry stalwarts, there are now many start-ups operating in this arena.
This approach enables you to avoid both ramp-up and capital expenses. The learning curve for Big Data technologies can be significant, depending on your goals, so you may not want to be burdened with the expense or time for that process to play out. Similarly, you may not wish to invest in the technology up front, and cloud solutions offer the same flattened expense profile as traditional SaaS business applications.
One disadvantage, given that this is a new field, is vendor longevity. Pick the wrong vendor and you may find yourself migrating to a new provider, and that’s tough given there aren’t really any standards for data portability in this domain.
The last option is to wait and see which security vendors emerge with built-in solutions. This will likely occur first among enterprise systems providers, with an advantage toward cloud offerings. Cloud vendors can distribute Big Data solution costs across everyone in their customer base who chooses to use it, rather than asking you to buy a whole Big Data stack to put in next to your other servers.
The trick here is to recognize that every industry vertical will have different Big Data strategies, seeking to extract different types of value. That means there will be no one-size-fits-all offering, and you’ll do best to choose the vendor who can extensively customize your solution.
-Steve Van Till
In this blog entry, we drill down on what types of security data make up a Big Data strategy and what types of analytics might help extract value from that data.
Let’s go back to two inextricably linked phrases: “Big Data strategy” and “extract value”. In other words: no strategy, no value. Whatever your company hopes to achieve with Big Data, you need to have a strategy, or you won’t be capturing and storing the data that delivers results. More on this later.
We’re all pretty familiar with the types of data generated by physical security systems…or are we? The truth is, we ignore most of it because it is too tedious or outright impossible to analyze with traditional tools.
Examples include endless access control activity logs, or millions of motion detection events from even a modest-sized video surveillance deployment. What about sensors deployed to monitor doors, windows, cabinets, or anything that can be opened or traversed? In an enterprise of any size, we very quickly get a mountain of routine events that we don’t analyze for larger patterns.
Here’s where the familiar structured query approach of most report generation tools fails to reveal the bigger patterns that emerge from large data sets and better analytics. That’s because structured queries only reduce large sets of records to smaller sets of records, possibly with a few calculations thrown in. They’re not designed to detect global patterns and long-term trends. Enter Big Data tools.
Suppose my enterprise has 500 locations, and I want to understand how all these facilities compare over the past five years. Our first problem is probably that the local security systems are an archipelago of pre-cloud information islands. Assuming it’s possible to aggregate that data, or to work with a cloud provider who can, there are many new questions we can answer:
- How do daily security patterns compare across my facilities?
- Is traffic flow the same this year as last year?
- What signals precede an actionable security event?
- What’s the year-over-year change in visitor-to-employee ratios?
- Which facilities exhibit the most off-hours video events?
- Where are administrative privileges changed most often?
- How do the preventive maintenance signals compare?
- Which locations stand out this week? This month? Every month?
- What is the seasonal variation in security events?
- How do my large facilities differ from my small facilities?
- How do my physical security data correlate with other data sources?
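As one small illustration of the kind of question above ("Which locations stand out this week?"), the following sketch, using hypothetical facility names and counts, flags any facility whose weekly off-hours event count sits more than two standard deviations above the fleet average.

```python
import statistics

# Hypothetical weekly off-hours event counts, keyed by facility.
weekly_events = {
    "Facility-001": 42, "Facility-002": 38, "Facility-003": 45,
    "Facility-004": 41, "Facility-005": 119, "Facility-006": 40,
    "Facility-007": 37, "Facility-008": 44,
}

counts = list(weekly_events.values())
mean = statistics.mean(counts)
stdev = statistics.stdev(counts)

# Flag anything more than two standard deviations above the fleet average.
standouts = [site for site, n in weekly_events.items() if n > mean + 2 * stdev]

print(standouts)  # → ['Facility-005']
```

Run across 500 real locations and five years of history, the same comparison surfaces outliers that no site-by-site report would ever show.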
The list could go on, all using discrete data types, without even discussing how video analytics at this scale can alter the value extracted from an enterprise security data repository. At this level, we simply regard video analytics as another input to other Big Data tools.
Mixing in other data sources enhances what we learn from security data. This should be part of a Big Data strategy to gain maximum business benefit from the investment. Possible sources: HR, compliance, network, certifications, sales, maintenance, shrinkage, safety, weather, and many others depending on your business and what you want Big Data to do for you.
The first thing to do is find out whether your security systems are collecting the types of data you need for your Big Data strategy. If not, work with your vendors to augment existing systems, or possibly switch vendors to establish a suitable source of data. The key question: can you marshal all relevant data into one (virtual) place so you can extract the value?
Here’s where cloud storage and cloud providers of Big Data services come in handy. They allow you to move data into the right environment and store as much as you want, without a huge initial investment in servers or software.
It’s as if the cloud and Big Data were made for each other.
-Steve Van Till
Big Data is the newest Big Buzzword, and it’s rolling across the IT landscape like the fast-paced “cloud” buzz-storm that preceded it. Given the vast amounts of unanalyzed data we collect in our industry, the tools and analytic techniques of Big Data hold the promise of extracting more business value from our security systems than ever before.
We’ve been using data for a long time, usually in relational databases that allow us to make queries and see the results in our desired visual form. Isn’t Big Data just a really large version of the database tools we’ve known for decades? Not quite; there are a number of competing definitions of “true” Big Data.
There’s the “3 Vs” camp: volume, velocity, and variety. Enough of these three in any combination and you’ve got Big Data. There’s no real description of the technology, and it ends up sounding a lot like Business Intelligence—advanced analytical and visualization techniques, but on a much larger scale.
Next, there’s the viewpoint that Big Data brings us the ability to understand large volumes of data at high velocity, but doing it with so-called “unstructured” data in real time. Think of gathering unfolding intelligence from random texts on social media in order to predict stock market direction. Traditional tools won’t do this, so you’ll inevitably be shopping for some new software technologies.
This second camp comes close to our third way of defining Big Data, which focuses on the technologies needed to handle extremely large data sets “too large to process using traditional techniques.” That’s vague I know, but it drives the question to where practitioners are talking petabytes and exabytes of data as the threshold for true Big Data.
Big Data techniques are now being used to provide insight into many scientific and business questions. For years, science has generated massive data sets in particle physics, meteorology, genomics, and many other fields. Scientists had to develop custom analytical techniques for each of these domains. Now, they can exploit the power of Big Data tools forged by cloud computing and made widely available through the open source software movement.
Businesses with access to rich data sets can use these new techniques to understand complex customer behavior patterns to improve market share. A recent MIT study found that companies using Big Data to their advantage were 5% more productive and 6% more profitable than their peers—those are impressive advantages.
Closer to home, the security industry is part of a growing trend toward adoption of Cyber Physical Systems (CPS) with widespread sensor networks that produce exponentially more data every year. This vast quantity of data is useless unless it can be meaningfully analyzed to produce business value.
And there’s the rub: you need to know which problems you want to solve. And for that, as one McKinsey columnist quips, the most important tool is…the pencil. What security problems could we solve for our customers if we had all the right data? If we start from the outcomes we want to achieve, we’ll avoid the trap of making cool charts just for the sake of cool charts.
Would understanding variations in employee behavior help identify potential inside threats? Would large enterprises allocate security dollars more efficiently if they could compare one facility against another? Can security departments improve their own system performance if they could benchmark it against their peers?
Time to sharpen our pencils.
-Steve Van Till