This new Azure Sentinel offering from Microsoft looks fantastic. It aims to improve on the usual SIEM offerings out there. SIEM is an acronym for security information and event management platform. This product or service can be set up and viewed right within Azure, of course.
The usual [overpriced] SIEM tools do not quite have the full, cloud-ready set of capabilities available with Sentinel. Sentinel is, in a nutshell:
“Azure Sentinel is a cloud-native security information and event manager (SIEM) platform that uses built-in AI to help analyze large volumes of data across an enterprise—fast. Azure Sentinel aggregates data from all sources, including users, applications, servers, and devices running on-premises or in any cloud, letting you reason over millions of records in a few seconds.”
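Sentinel sits on top of a Log Analytics workspace, so once data is flowing in you can query it with Kusto (KQL). As a rough illustration of "reasoning over millions of records", here is a minimal Python sketch using the azure-monitor-query package; the workspace ID placeholder and the use of the SigninLogs table are assumptions for the example, not part of the announcement.

```python
# Minimal sketch: run a Kusto (KQL) query against the Log Analytics
# workspace that backs an Azure Sentinel instance.
# Assumes `pip install azure-monitor-query azure-identity` and that the
# workspace ID below is replaced with your own (placeholder here).
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

credential = DefaultAzureCredential()
client = LogsQueryClient(credential)

# Count failed sign-ins per user over the last day. SigninLogs is one of
# the tables Sentinel can ingest via the Azure AD connector (assumed here).
query = """
SigninLogs
| where ResultType != "0"
| summarize FailedAttempts = count() by UserPrincipalName
| top 10 by FailedAttempts desc
"""

response = client.query_workspace(
    workspace_id="<your-log-analytics-workspace-id>",
    query=query,
    timespan=timedelta(days=1),
)

# Print the result rows (top offending accounts, if any).
for table in response.tables:
    for row in table.rows:
        print(row)
```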
Microsoft Azure has introduced a PostgreSQL GUI extension. It is part of Azure Data Studio. Microsoft has created a really nice graphical user interface to manage not just one Postgres database, but multiple ones. There is nothing wrong with the command line, but for getting certain types of tasks done, this is a huge improvement.
The GUI provides a thorough overview and, yes, visualization of databases, servers, tables, indexes, and more. In addition, the new tool allows for connecting to a database directly or to a local or cloud-based server. The PostgreSQL extension also allows color coding of different servers within the GUI for ease of use when there is more than one server.
The new tool also allows for locating database objects, writing queries with IntelliSense, creating query templates, customizing the editor, and Git source control integration.
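The same connection details you enter in the Azure Data Studio GUI (host, database, user, SSL) work outside it too. As a point of comparison, here is a minimal Python sketch using psycopg2 against an Azure Database for PostgreSQL server; the server name, user, and password below are placeholders, not real values.

```python
# Minimal sketch: connect to an Azure Database for PostgreSQL server with
# the same details you would enter in Azure Data Studio.
# Assumes `pip install psycopg2-binary`; all credentials are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="myserver.postgres.database.azure.com",  # placeholder server name
    dbname="postgres",
    user="myadmin@myserver",   # single-server user format: user@servername
    password="<password>",
    sslmode="require",         # Azure enforces SSL connections by default
)

with conn, conn.cursor() as cur:
    # List the user tables the GUI would show in its object explorer.
    cur.execute("""
        SELECT schemaname, tablename
        FROM pg_catalog.pg_tables
        WHERE schemaname NOT IN ('pg_catalog', 'information_schema')
        ORDER BY schemaname, tablename;
    """)
    for schema, table in cur.fetchall():
        print(f"{schema}.{table}")

conn.close()
```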
Azure Archive Storage is perfect for rarely referenced or used data. Whether it is archived health, government, business, or any other type of data, it may nonetheless need a place to be stored, ‘just in case’ … or a legal or organizational requirement may mandate that all of it be kept. Azure Archive Storage is low-cost storage for just this purpose: data that simply needs to be securely stored away, preferably at a low cost.
Azure Archive Storage is perfectly suited for any organization tired of using old tape backups, as well as for aging video and other multimedia content. It is also perfect for corporate or governmental requirements mandating that data be kept for, say, 7-14 years. In addition, the data is automatically encrypted after transfer.
General Azure Storage pricing is tiered, with Archive Storage having the “lowest storage cost and higher data retrieval costs”. In other words, for data that is truly rarely accessed and destined for long-term storage, this tier is a very good deal.
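For blob data, the archive tier is simply an access tier you set per blob. Here is a minimal Python sketch using the azure-storage-blob (v12) package that uploads a file straight into the Archive tier; the connection string, container, and file names are placeholders for illustration.

```python
# Minimal sketch: push a rarely-used file straight into the Archive tier
# of Azure Blob Storage. Assumes `pip install azure-storage-blob`;
# connection string, container, and file names are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("long-term-archive")

with open("tape-backup-2012.tar.gz", "rb") as data:
    container.upload_blob(
        name="backups/tape-backup-2012.tar.gz",
        data=data,
        overwrite=True,
        standard_blob_tier="Archive",  # lowest storage cost, higher retrieval cost
    )

# An existing blob can also be demoted to the Archive tier later:
blob = container.get_blob_client("backups/tape-backup-2012.tar.gz")
blob.set_standard_blob_tier("Archive")
```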
The services and products available in Azure Marketplace are always growing. It is a very impressive marketplace, with offerings in categories ranging from “Compute” [of course!] to Analytics, Databases, and Security and Identity. In fact, the Identity services look very intriguing: “Alert Logic” and “ZScaler” are offered under a relatively new model, “BYOL” (Bring Your Own License). The “ZScaler” service in particular is interesting in that it can “create fast, secure connections between users and applications, regardless of device, location, or network”. Its connector can be installed within an Azure Cloud instance. “ZScaler” looks to be very useful for both private and hybrid clouds.
This is a very interesting real-world read about a large company moving to Microsoft Virtual Desktop Infrastructure.
Rakuten Group Secures Sensitive Data with Virtual Desktop Infrastructure
“… Rakuten has turned to Microsoft Windows Server 2016 Remote Desktop Services (RDS). Not only does RDS provide an easy path to integrating heterogeneous systems, but it also provides an additional layer of security so new systems do not compromise Rakuten’s existing corporate infrastructure.”
This is fantastic: onsite data can be VERY, VERY large, or ‘heavy’, depending on how you define it in non-technical terms. Moving or migrating from an office [or even a traditional datacenter] to a Cloud service can be daunting, given the amount of data needing to be uploaded to a provider. Uploads over the Internet can conceivably take days or weeks! Enter the “Data Box” or the smaller “Data Box Disk” from Microsoft Azure. These secure devices can be ordered from Azure. Once they arrive, simply plug them into your network [or server], rapidly transfer crazy amounts of data to them, and ship the device back to Azure for upload to your Cloud account. A small sketch of the ‘fill’ step follows the quoted overview below.
“Azure Data Box Family
Data migration to Azure made fast, simple, and secure
Now offering Azure Data Box with 100TB capacity, and Data Box Disk with up to 40TB capacity
From terabytes to petabytes, choose the device that works for your migration needs
Both devices keep your data safe and secure with AES encryption
Order, fill, and return for upload to Azure – all tracked in the familiar Portal”
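Once the device arrives and is connected, "filling" it is essentially a file copy to the shares it exposes. Here is a minimal Python sketch of that step, assuming the Data Box's SMB share has already been mounted locally; both paths below are placeholders, not the device's actual share layout.

```python
# Minimal sketch: copy a local data set onto a mounted Azure Data Box share.
# Assumes the device's SMB share has already been mounted at DEST
# (both paths below are placeholders for this example).
import shutil
from pathlib import Path

SRC = Path("/mnt/onsite-data")                    # local data to migrate
DEST = Path("/mnt/databox/BlockBlob/migration")   # mounted Data Box share

copied = 0
for src_file in SRC.rglob("*"):
    if src_file.is_file():
        target = DEST / src_file.relative_to(SRC)
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src_file, target)            # preserves timestamps
        copied += 1

print(f"Copied {copied} files to the Data Box share")
```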
I really like this way of thinking outside the box! Some of the old, and current, concepts on password complexity, length, history, etc. are being revised. There is some new thinking on the matter, based mainly on trends and analytics Microsoft has gathered from millions of hack attempts against Azure-based resources. A rough sketch of enforcing a couple of these rules in code follows the list below.
New Microsoft recommendations:
“Maintain an 8-character minimum length requirement (and longer is not necessarily better).
Eliminate character-composition requirements.
Eliminate mandatory periodic password resets for user accounts.
Ban common passwords, to keep the most vulnerable passwords out of your system.
Educate your users not to re-use their password for non-work-related purposes.
Enforce registration for multi-factor authentication.
Enable risk based multi-factor authentication challenges.”
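The "8-character minimum" and "ban common passwords" items are easy to picture in code. Here is a rough Python sketch of what those two checks might look like server-side; the banned list is just a tiny illustrative sample, not a real deny list.

```python
# Rough sketch of two of the recommendations above: an 8-character minimum
# and a ban on common passwords. The banned list is a tiny illustrative
# sample; a real deployment would use a much larger, maintained list.
BANNED_PASSWORDS = {
    "password", "12345678", "qwerty123", "iloveyou", "letmein1",
}

def password_is_acceptable(password: str) -> tuple[bool, str]:
    """Return (ok, reason) for a proposed password."""
    if len(password) < 8:
        return False, "Password must be at least 8 characters long."
    if password.lower() in BANNED_PASSWORDS:
        return False, "Password is too common."
    # Note: no character-composition rules and no forced periodic resets,
    # in line with the guidance above.
    return True, "OK"

if __name__ == "__main__":
    for candidate in ("password", "short", "correct horse battery staple"):
        print(candidate, "->", password_is_acceptable(candidate))
```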
We had a few break-ins in the neighborhood recently, so I decided to set up an outdoor surveillance camera. But I needed to upload motion-detected videos to an FTP-type site. So I had to provide video file storage for an outdoor WiFi-based security IP camera. I will use a D-Link video camera and a cloud-based location to store the videos. As this is for home use, there is no server. I used to have servers at home, but nowadays I work off Azure or other cloud-based providers and a home server is no longer needed or feasible: the server is cloud-based. Besides, home servers are too loud, although when I had them at home, they were pretty nifty ;>
6.59 terabytes of disk space, on a Solid State Drive?? WOW. [Not to mention 448 GB of RAM!] A small sketch of listing these sizes from code follows the quote below.
“We have just recently announced the new series of VM sizes for Microsoft Azure Virtual Machines called the G-series, providing the most memory, the highest processing power and the largest amount of local SSD of any Virtual Machine size currently available in the public cloud. It easily handles deployments of mission critical applications such as large relational database servers (SQL Server, MySQL, etc.) and large NoSQL databases as well as the most demanding, very large scale-up enterprise systems.
G-series offers up to 32 vCPUs using the latest Intel® Xeon® processor E5 v3 family, 448GB of memory, and 6.59 TB of local Solid State Drive (SSD) space.”
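If you want to see those numbers from code rather than from the announcement, the compute management SDK can list the VM sizes available in a region. A minimal Python sketch, assuming the azure-mgmt-compute and azure-identity packages and a real subscription ID in place of the placeholder:

```python
# Minimal sketch: list G-series VM sizes in a region and show the specs
# quoted above (cores, memory, local SSD). Assumes
# `pip install azure-mgmt-compute azure-identity` and a real subscription ID.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

client = ComputeManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

for size in client.virtual_machine_sizes.list(location="eastus"):
    if size.name.startswith("Standard_G"):
        print(
            f"{size.name}: {size.number_of_cores} vCPUs, "
            f"{size.memory_in_mb / 1024:.0f} GB RAM, "
            f"{size.resource_disk_size_in_mb / 1024:.0f} GB local SSD"
        )
```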