bginfo (4.24)

The content below is taken from the original ( bginfo (4.24)), to continue reading please visit the site. Remember to respect the Author & Copyright.

Automatic desktop backgrounds that include system information

Learning ARM assembly with visUAL


Learning assembly is very important if you want to get a grasp of how a computer truly works under the hood. VisUAL is a very capable ARM emulator for anyone interested in learning ARM assembly.

The GUI: a simple program to ADD two numbers

In addition to supporting a large subset of ARM instructions, the CPU is emulated via a series of elaborate and instructive animations that help visualise the flow of data to/from registers, any changes made to flags, and any branches taken. It also packs very useful animations to help grasp some of the trickier instructions, such as shifts and stack manipulations.

As it was designed specifically to be used as a teaching tool at Imperial College London, the GUI is very friendly: syntax errors are highlighted, and an example of the correct syntax is shown.
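A program of the kind the screenshot caption refers to, adding two numbers, fits in a few lines of the UAL-style syntax visUAL accepts (the register choices here are ours):

```armasm
; Add two numbers and leave the result in R2
MOV  R0, #21        ; first operand
MOV  R1, #21        ; second operand
ADD  R2, R0, R1     ; R2 = R0 + R1, i.e. 42
```

Stepping through this in the emulator shows the two MOV transfers and the ADD result flowing into R2 in the register-animation pane.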

Branch visualisation, credits: VisUAL homepage

You can also do the usual things you would expect from any emulator, such as single step through execution, set breakpoints, and view data in different bases. It even warns you of any possible infinite loops!

That being said, lugging such an extravagant GUI around comes at a price: programs that consume a few hundred thousand cycles hog far too much RAM, and should instead be run in the supported headless mode.


Filed under: Microcontrollers

Phone Security


...wait until they type in payment information, then use it to order yourself a replacement phone.

onenote (16.0.8730.2122)


Microsoft OneNote Online Desktop Client

wakemeonlan (1.82)


Turn on computers on your network with Wake-on-LAN packet

The Truth About The Tesla Semi-Truck

Thank you for your amazing support this year! Help this channel get better by supporting at Patreon:





Thank you to my Patreon supporters: Adam Flohr, darth patron, Zoltan Gramantik, Henning Basma, Karl Andersson, Mark Govea, Mershal Alshammari, Hank Green, Tony Kuchta, Jason A. Diegmueller, Chris Plays Games, William Leu, Frejden Jarrett, Vincent Mooney, Ian Dundore, John & Becki Johnston, and Nevin Spoljaric.

Once again thank you to Maeson for his amazing music. Check out his soundcloud here:

Microsoft Remote Connectivity Analyzer: Troubleshoot Office 365 apps & services issues


Remote Connectivity Analyzer

Formerly released as Exchange Server Remote Connectivity Analyzer, Microsoft Remote Connectivity Analyzer is a tool that allows you to analyze, troubleshoot, and fix issues with Office 365 apps & other Microsoft services. The tool houses a collection of web-based tools enabling a user […]


Control a Quadcopter over Websockets


The interface

Everyone’s favourite IoT module, the ESP8266, is often the go-to choice for any project that needs quick and cheap control over the web. [Andi23456] wanted to control his quadcopter from the luxury of his mobile phone, and decided permanently tethering an ESP12-E module to the quadcopter was exactly what he required.

The ESP8266, really showcasing its all-round prowess, hosts both a web server for an HTML5-based joystick and a Websockets server, so that a client such as a phone can interact with it over a fast, low-latency connection. Once the ESP8266 receives the input, it uses interrupts to generate the corresponding PPM (Pulse Position Modulation) signal, which the RC receiver on the quadcopter can understand. Very cool!

What really makes this realtime(ish) control viable is Websockets, a protocol that basically allows you to flexibly exchange data over an “upgraded” HTTP connection without having to lug around headers each time you communicate. If you haven’t heard of Websockets, you really should check out this library or even watch this video to see what you can achieve.
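The joystick-to-PPM mapping such a setup performs can be sketched in a few lines. The channel count, pulse range, and frame period below are typical RC values, not figures taken from [Andi23456]'s project:

```python
def stick_to_pulse_us(value, min_us=1000, max_us=2000):
    """Map a joystick axis in [-1.0, 1.0] to an RC pulse width in microseconds."""
    value = max(-1.0, min(1.0, value))          # clamp out-of-range input
    return int(min_us + (value + 1.0) / 2.0 * (max_us - min_us))

def build_ppm_frame(channels, frame_us=22500):
    """Return per-channel pulse widths plus the sync gap that pads the frame."""
    pulses = [stick_to_pulse_us(v) for v in channels]
    sync_gap = frame_us - sum(pulses)           # remainder of the frame is the sync pulse
    return pulses, sync_gap

# Centered sticks on a 4-channel frame
pulses, gap = build_ppm_frame([0.0, 0.0, 0.0, 0.0])
print(pulses, gap)   # [1500, 1500, 1500, 1500] 16500
```

On the ESP8266 the timing itself would be driven by timer interrupts toggling a GPIO pin; the arithmetic above is the part that turns Websocket joystick messages into channel pulse widths.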

Filed under: drone hacks, Wireless Hacks

Micro-ATX Arduino is the Ultimate Breakout Board


If you’ve been hanging around microcontrollers and electronics for a while, you’re surely familiar with the concept of the breakout board. Instead of straining to connect wires and components to ever-shrinking ICs and MCUs, a breakout board makes it easier to interface with the device by essentially making it bigger. The Arduino itself, arguably, is a breakout board of sorts. It takes the ATmega chip, adds the hardware necessary to get it talking to a computer over USB, and brings all the GPIO pins out to easy-to-manage header pins.

But what if you wanted an even bigger breakout board for the ATmega? Something that really had some leg room. Well, say no more, as [Nick Poole] has you covered with his insane RedBoard Pro Micro-ATX. Combining an ATmega32u4 microcontroller with standard desktop PC hardware is just as ridiculous as you’d hope, but surprisingly does offer a couple tangible benefits.

RedBoard PCB layout

The RedBoard is a fully compliant micro-ATX board, and will fit in pretty much any PC case you may have laying around in the junk pile. Everything from the stand-off placement to the alignment of the expansion card slots has been designed so it can drop right into the case of your choice.

That’s right, expansion slots. It’s not using PCI, but it does have a variation of the standard Arduino “shield” concept using 28 pin edge connectors. There’s a rear I/O panel with a USB port and ISP header, and you can even add water cooling if you really want (the board supports standard LGA 1151 socket cooling accessories).

While blowing an Arduino up to ATX size isn’t exactly practical, the RedBoard is not without legitimate advantages. Specifically, the vast amount of free space on the PCB allowed [Nick] to add 2Mbits of storage. There was even some consideration to making removable banks of “RAM” with EEPROM chips, but you’ve got to draw the line somewhere. The RedBoard also supports standard ATX power supplies, which will give you plenty of juice for add-on hardware that may be populating the expansion slots.

As cheap and plentiful as mini-ITX and micro-ATX cases are, it’s no surprise people seem intent on cramming hardware into them. We’ve covered a number of attempts to drag other pieces of hardware kicking and screaming into that ubiquitous beige-box form factor.

Filed under: Arduino Hacks, computer hacks, Microcontrollers

Storage Christmas cracker: My band is called 1023MB. We haven’t had a gig yet


Last 2017 roundup: Death Star file protection crap, Samsung DRAM miracles and more

Here’s a final collection of storage news before Christmas 2017. Imagine you are having Christmas dinner and you get a bunch of crackers. Pull them and, instead of fortune cookie statements and bad jokes, this list of news items tumbles out.…

Happy 5th Anniversary, XRP Ledger!


Ripple has had a lot to celebrate over the last year!

From adding new RippleNet members, to launching our first blockchain conference, to completing the escrow of 55 billion XRP, it’s clear that the team’s hard work to enable the Internet of Value (IoV), coupled with the XRP community’s support, is paying off.

In fact, XRP has experienced phenomenal growth — up nearly 12,000 percent — from December 2016 to December 2017. This makes XRP’s performance better than any other digital asset in the industry in 2017.

But just as the year is coming to a close, there’s one more reason for the Ripple team and the XRP community to celebrate — the five year anniversary of the XRP Ledger.

XRP sets the standard from the start

Since 2012, XRP has been considered the only digital asset with a clear institutional use case that works to solve the multi-trillion dollar liquidity problem that plagues banks, payment providers and corporates.

For example, XRP can provide liquidity to financial institutions who need to send cross-border payments — presenting a greater opportunity for these institutions to widen their footprint in major corridors or gain more access to emerging markets.

Cuallix, a major payment provider, is the first RippleNet member to use xRapid — Ripple’s solution that uses XRP to source liquidity — to reduce the cost of sending cross-border payments and remittances from the U.S. to Mexico.

The XRP Ledger, the digital asset’s underlying distributed ledger technology, is the key reason why XRP has steadily led the pack as the fastest, most cost-effective, and most scalable digital asset for payments, which helps advance its goal of enabling the IoV.

IoV is the belief that money should move in the same fashion as digital information moves today — in real time.

In order to deliver on this vision, Ripple continues to support the development and adoption of  the XRP Ledger to ensure that XRP remains the fastest, most scalable digital asset on the market.

Additionally, the digital asset market is seeing more demand for XRP which speaks to consumers’ understanding that XRP is the best digital asset to remove the friction from global payments.

Ripple pledges to increase XRP usage

The Ripple team is committed to furthering the adoption and usage of XRP for payments. In fact, we continuously work with regulators, governments and central banks to better understand how our technology can help remove the friction from cross-border payments.

And, as more digital assets flood the market, the value of digital assets will be determined by their utility and the problem they solve for their users.

Since XRP has continually provided a strong use case and is undisputed in its technological achievements, it makes sense that the market would take notice, prompting a surge in XRP’s value.

What’s more, XRP is now available to more than 50 exchanges worldwide to better serve the global demand for the digital asset.

It has been a phenomenal five years, and the Ripple team is appreciative of the community’s support. We can’t wait to see what the future years will bring!

Watch our live Q&A with David Schwartz today. As Brad Garlinghouse said in his livestream, “David Schwartz is the Steph Curry of cryptography.”

The post Happy 5th Anniversary, XRP Ledger! appeared first on Ripple.

MDT (6.3.8450.0)


Microsoft Deployment Toolkit (MDT) is a free computer program from Microsoft that assists with the deployment of Microsoft Windows and Office

Audi’s latest models add Amazon Music to the dashboard


If you're an Apple CarPlay or Android Auto user, you have no shortage of music streaming services baked into your dashboard. But if you're relying on your vehicle's default control panel, the choices start to dwindle. While automakers like Ford have s…

megui (1.0.2774)


Portable video converter front-end for many free command line tools

PowerShell for SharePoint Online Usage Scenarios


PowerShell is not only a powerful tool to administer and manage a SharePoint Online (SPO) tenant, but also a tool for the common activities an Office 365 or SPO Administrator performs. In this article, I will cover some of the most common PowerShell for SharePoint Online usage scenarios, as described in Figure 1.



Figure 1 – Common PowerShell for SPO Usage Scenarios.

Service Configuration and Administration Scenarios

Under these scenarios, we have any action that applies specific SPO settings available through SPO PowerShell cmdlets and/or SPO APIs. Some examples of typical operations that fall under these scenarios are the following:

  • While it’s true that OneDrive for Business (ODFB) and SPO provide support for hashtag (#) and percent (%) symbols in file names and folder names, you need to explicitly enable this in your tenant by using PowerShell; there is no way to enable support for these characters in the SPO Administration UI. To enable these symbols in ODFB and SPO, you must use the Set-SPOTenant cmdlet as follows:
Set-SPOTenant -SpecialCharactersStateInFileFolderNames Allowed

  • Configuring the sharing capability at the tenant or site collection level is very important when we want to share an Office 365 Group site with external users without adding them as guests in the Group. To enable external user sharing in an Office 365 Group site, we only need to use the Set-SPOSite cmdlet as detailed below:
Set-SPOSite -Identity $sO365GroupSite -SharingCapability ExternalUserSharingOnly

Auditing Operations and Reporting scenarios

On the one hand, the Auditing Operations scenario is intended to provide information about what is happening in any logical container in an SPO tenant (Site Collections, Sites, Lists, Document Libraries, etc.) with regard to common operations, such as creating or updating content, making updates to the SPO security model, and so on. On the other hand, the Reporting scenario covers generating reports about activities taking place in SPO. Some good examples of these scenarios:

  • Get information about the SPO tenant’s logical and information architecture in terms of deployed Site Collections, Sites, Lists, and document libraries.
  • Get detailed information about security settings at different levels (Site Collections, Sites, Lists and document libraries, list elements and documents) such as:
    • SharePoint security groups in use
    • Users/Group members of each SharePoint security group



For instance, if you are asked to provide a report with all the members of each SharePoint Security Group configured on an SPO site, you only need to execute the following PowerShell script, which uses the SPO Get-SPOSiteGroup and Get-SPOUser cmdlets:

$spoSharePointGroups = Get-SPOSiteGroup -Site $sSiteUrl
foreach ($spoSharePointGroup in $spoSharePointGroups) {
    Write-Host "Users in" $spoSharePointGroup.Title ":"
    $spoUsers = Get-SPOUser -Site $sSiteUrl -Group $spoSharePointGroup.Title
    Write-Host " -> " $spoUsers.LoginName
    Write-Host "--------------------------------" -ForegroundColor Green
}

  • Get detailed information about a SPO tenant:
    • Storage used in each site collection in the tenant
    • Changes happening in the tenant

For instance, to query the Office 365 audit log and get information about file activities happening in all the sites in the tenant, simply execute the following PowerShell script:

$PSSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Cred -Authentication Basic -AllowRedirection
Import-PSSession $PSSession
Search-UnifiedAuditLog -StartDate 12/1/2017 -EndDate 12/7/2017 -RecordType SharePointFileOperation -Operations FileAccessed -SessionId "Docs_SharepointViews" -SessionCommand ReturnNextPreviewPage

SPO Solutions Deployment Scenario

PowerShell is a common vehicle to deploy solutions on top of SPO, including any kind of customization to new or existing SPO Sites. Under this scenario, we can find a wide range of possibilities:

  • Apply a common look and feel (for instance a theme) to all the sites defined under a specific site collection.
  • Provision the full information architecture required for an SPO solution being developed: Site Collections, Sites, Site Columns, Content Types, etc.
  • Deploy Apps or WebParts to new or existing SPO Sites.
  • Configure the security model for the solution (SharePoint security groups, permission levels, permission inheritance, etc.).

As an example, you can create a new SPO list in an SPO site using the following PowerShell script that makes use of the client-side object model (CSOM) SPO API:

#Adding the Client OM Assemblies
$sCSOMRuntimePath = $sCSOMPath + "\Microsoft.SharePoint.Client.Runtime.dll"
$sCSOMAssemblyPath = $sCSOMPath + "\Microsoft.SharePoint.Client.dll"
Add-Type -Path $sCSOMAssemblyPath
Add-Type -Path $sCSOMRuntimePath
#SPO Client Object Model Context
$spoCtx = New-Object Microsoft.SharePoint.Client.ClientContext($sSiteUrl)
$spoCredentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($sUserName, $sPassword)
$spoCtx.Credentials = $spoCredentials
#Creating the List
$spoListCreationInformation = New-Object Microsoft.SharePoint.Client.ListCreationInformation
$spoListCreationInformation.Title = $sListName
$spoListCreationInformation.TemplateType = [int][Microsoft.SharePoint.Client.ListTemplateType]::GenericList
$spoList = $spoCtx.Web.Lists.Add($spoListCreationInformation)
$spoCtx.ExecuteQuery()
$spoCtx.Dispose()


Information Loading and Migration scenarios

Finally, the last scenarios cover situations where it’s required either to upload data to SPO sites or to move/migrate information to SPO sites. Note that this information could come from another SPO site or even another SPO tenant, from a SharePoint On-Premises farm, or from a corporate file server. Some examples of situations that fall under these scenarios are the following:

  • Move documents from local file systems, other cloud storage services (Dropbox, Box, Google Drive), or SharePoint On-Premises to SPO and OneDrive for Business.
  • Load information in SPO coming from different information sources (Local files, SQL Database, non-SQL database, etc).

For instance, the following PowerShell script allows you to upload information from a CSV file to an SPO list using the SPO CSOM API:

$spoCtx = New-Object Microsoft.SharePoint.Client.ClientContext($sSiteUrl)
$spoCredentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($sUserName, $sPassword)
$spoCtx.Credentials = $spoCredentials
#Adding Data to an existing list
$spoList = $spoCtx.Web.Lists.GetByTitle($sListName)
foreach ($sItem in $tblItems) {
    Write-Host "Adding" $sItem.SPOListItem "to $sListName"
    $spoListItemCreationInformation = New-Object Microsoft.SharePoint.Client.ListItemCreationInformation
    $spoListItem = $spoList.AddItem($spoListItemCreationInformation)
    $spoListItem["Title"] = $sItem.SPOListItem
    $spoListItem.Update()
    $spoCtx.ExecuteQuery()
}





PowerShell for SPO is not only a tool for platform administration and configuration tasks, but also for many other common activities an SPO (or Office 365) Administrator may require: auditing operations, reporting, SPO solutions deployment, data loading, and migration.

The post PowerShell for SharePoint Online Usage Scenarios appeared first on Petri.

Neural Network Learns SDR Ham Radio


Identifying ham radio signals used to be easy. Beeps were Morse code, voice was AM unless it sounded like Donald Duck, in which case it was sideband. But there are dozens of modes in common use now, including TV, digital data, digital voice, and FM, with more coming online every day. [Randaller] used CUDA to build a neural network that interfaces with an RTL-SDR dongle and can classify the signals it hears. Since it is a neural network, it isn’t so much programmed to do this as trained. The proof of concept is trained to distinguish FM, SECAM, and TETRA, but you can train it to recognize other modulation schemes if you want to invest the time.

It isn’t that big of a task to identify signals using your built-in neural network. However, this is a great example of a practical neural net and it does open the door to other possibilities. For example, automated monitoring of multiple channels would need something like this.

One interesting tidbit is that the neural network doesn’t really know what it is learning, so input samples could be IQ samples, audio, or even waterfall graphics. You just have to use the same input to train that you want to use during operation. In fact, the code apparently started out as an image classification network from a course by Stanford.
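To make the "same input for training and operation" point concrete, here is a minimal sketch of framing a raw sample stream into fixed-length training windows; the window and hop sizes are arbitrary choices of ours, not taken from [Randaller]'s code:

```python
def frame_samples(samples, window=1024, hop=512):
    """Slice a stream of (I, Q) pairs into fixed-length, half-overlapping windows.

    Whatever representation is chosen (IQ windows, audio, waterfall images),
    the identical framing must be applied at training time and at run time,
    because the network only learns whatever it is fed.
    """
    frames = []
    for start in range(0, len(samples) - window + 1, hop):
        frames.append(samples[start:start + window])
    return frames

# 4096 dummy IQ pairs -> overlapping 1024-sample windows
stream = [(0.0, 0.0)] * 4096
frames = frame_samples(stream)
print(len(frames), len(frames[0]))   # 7 1024
```

Each window would then be paired with a modulation label (FM, SECAM, TETRA, …) to form one training example.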

If this gives you the urge to go out and buy an RTL-SDR dongle, you might want to look at some reviews. What else could you do with an intelligent radio? We’ve already seen a different kind of neural network decode Enigma traffic.

Filed under: Wireless Hacks

Automating IAM Roles For Cross-Account Access Series Overview


The AWS Partner Network Blog has recently published a series describing a method to automate the creation of an IAM role for cross-account access, and how to collect the information needed for a partner to assume the role after creation. This post gives readers an overview of the series, summarizing each of the individual posts with links back to the original content for further reading.

As a reminder, cross-account IAM roles allow customers to grant access to resources within their account to a partner or other third parties while enabling the customers to maintain their security posture. Cross-account roles allow the customer to delegate access without the need to distribute key material, and without the burden on the third party to safely handle key material after receipt.

The blog series kicked off with a post that explained how to create a custom launch stack URL for AWS CloudFormation. The URL will take users directly to the CloudFormation Create Stack wizard, with values for the Amazon S3 template location, stack name, and default parameters already populated. The launch stack URL eliminates the need to exchange template files with the customer, and ensures that the customer is using the proper template with the correct values.
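A launch stack URL of this kind can be assembled from the template location, stack name, and parameter defaults. The sketch below follows AWS's documented quick-create link format (including the `param_` prefix for parameter defaults); the bucket, stack, and parameter names are hypothetical:

```python
from urllib.parse import urlencode

def launch_stack_url(region, template_url, stack_name, params=None):
    """Build a CloudFormation quick-create link that pre-fills the template
    location, stack name, and default parameter values.

    The 'param_' prefix convention follows the CloudFormation quick-create
    link format; verify against the current AWS documentation.
    """
    query = {"templateURL": template_url, "stackName": stack_name}
    for key, value in (params or {}).items():
        query["param_" + key] = value
    return (
        f"https://console.aws.amazon.com/cloudformation/home?region={region}"
        "#/stacks/create/review?" + urlencode(query)
    )

url = launch_stack_url(
    "us-east-1",
    "https://s3.amazonaws.com/example-bucket/role-template.yaml",  # hypothetical bucket
    "partner-access-role",
    {"ExternalId": "abc123"},
)
print(url)
```

Handing the customer this single URL replaces emailing template files and walking them through the Create Stack wizard by hand.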

The second post describes how to use an AWS Lambda function to populate a AWS CloudFormation template with uniquely generated values. The series example uses an External ID, an ID that is unique for each end user. This value needs to be set within the CloudFormation template. When triggered, the Lambda function pulls down the default template, inserts a generated unique External ID into the template, and uploads the customized template to an S3 bucket. Once the template upload is complete, the end user is presented with a custom launch stack URL containing the unique template Amazon S3 location. Finally, we demonstrated how to use the Launch Stack icon to make the URL more visible to users.
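The Lambda customization step might look like the following sketch. The parameter name `ExternalId` and the template layout are illustrative, not the series' actual template:

```python
import json
import uuid

def customize_template(template_body):
    """Insert a per-customer ExternalId default into a CloudFormation template.

    Sketch of the Lambda step described above: pull the default template,
    generate a unique External ID, inject it, and return the customized body
    for upload to S3. Parameter naming is an assumption for illustration.
    """
    template = json.loads(template_body)
    external_id = uuid.uuid4().hex            # unique per end user
    template.setdefault("Parameters", {})["ExternalId"] = {
        "Type": "String",
        "Default": external_id,
    }
    return json.dumps(template), external_id

base = json.dumps({"Resources": {}})
customized, ext_id = customize_template(base)
print(ext_id in customized)   # True
```

The real function would also upload `customized` to an S3 bucket and return a launch stack URL pointing at that unique template location.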

The third post details how to reliably return the Amazon Resource Name (ARN) of the cross-account role created by AWS CloudFormation to a third party. As a reminder, the third party must use the ARN, as well as the ExternalID, when assuming the role in the end user’s account. The post demonstrates a CloudFormation custom resource designed to send the ARN back to the third-party account, which consumes the ARN and stores it for later use.
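Whatever transport carries the ARN to the third-party account, a custom resource must also signal completion to CloudFormation through its pre-signed ResponseURL. A sketch of assembling that response body follows; the field names match the documented custom-resource response schema, while the ARN and resource names are made up:

```python
import json

def build_custom_resource_response(event, role_arn):
    """Build the body a CloudFormation custom resource PUTs back to the
    pre-signed ResponseURL from its request event; the Data payload is one
    place the created role's ARN can be surfaced for later use.
    """
    return json.dumps({
        "Status": "SUCCESS",
        "PhysicalResourceId": "cross-account-role-notifier",
        "StackId": event["StackId"],
        "RequestId": event["RequestId"],
        "LogicalResourceId": event["LogicalResourceId"],
        "Data": {"RoleArn": role_arn},
    })

# Hypothetical event fields, shaped as CloudFormation sends them
event = {"StackId": "stack-1", "RequestId": "req-1", "LogicalResourceId": "NotifyPartner"}
body = build_custom_resource_response(event, "arn:aws:iam::111122223333:role/PartnerAccess")
print(json.loads(body)["Data"]["RoleArn"])
```

Failing to send this response (even on error, with Status "FAILED") leaves the stack hanging until the custom resource times out.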

The final post of the series brings the details of the previous three blog posts together into one cohesive solution. It shows how to implement the automation of cross-account role creation for customer onboarding using the techniques described in each post in a completed workflow. The workflow creates a smoother onboarding experience for the customer while creating a secure way for the third party account to create resources within the customer account.

We hope that the blog series can help you and your company improve your customer on-boarding experience. You can avoid the sharing of sensitive keys and the error-prone approach of requiring your customers to cut and paste information in their account and your on-boarding portal.

About the Author

Erin McGill is a Solutions Architect in the AWS Partner Program with a focus on DevOps and automation tooling.

10 RISC OS gift ideas for Christmas


Here are some thoughts for some RISC OS gifts to treat yourself or your RISC OS loved ones from 2017.

1. The latest 5.23 ROM was released. Get a copy of the latest release on SD card, or combined with lots of great software.
2. The BBC BASIC manual, now updated after 25 years.
3. The latest DDE release, complete with a wealth of electronic reference materials.
4. The latest edition of !Artworks, now at 2.X3.
5. Contribute to a bounty to help new features make it into future RISC OS releases.
6. Relax with some new Arcade games from Amcog games.
7. Organizer 2.28 gives you the ultimate Calendar and Organiser on your RISC OS machine.
8. Get your Fonts back into order with Font Directory Pro.
9. Keep using your old software on new hardware with Aemulor.
10. A Raspberry Pi is stocking-sized with a price to match, and opens up the RISC OS and Linux software worlds.

What would you like to see under the tree?


RIP, AOL Instant Messenger


We knew this day would come. One of the major parts of our formative years on the worldwide web — we called it that back in the day — will cease to be. AOL Instant Messenger (AIM) came to a close a few hours ago. While we've already eulogized it, i…

ADSL Robustness Verified By Running Over Wet String


A core part of the hacker mentality is the desire to test limits: trying out ideas to see if something interesting, informative, and/or entertaining comes out of it. Some employees of Andrews & Arnold (a UK network provider) applied this mentality to connecting their ADSL test equipment to some unlikely materials. The verdict of the experiment: yes, ADSL works over wet string.

ADSL itself is something of an ingenious hack, carrying data over decades-old telephone wires designed only for voice. ADSL accomplishes this in part through robust error correction measures that keep the bytes flowing through lines never designed for ADSL frequencies. The flow of bytes may slow over bad lines, but they will keep moving.

How bad? In this case, a pair of strings dampened with salty water. But there are limits: the same type of string dampened with just plain water was not enough to carry ADSL.

The pictures of the test setup also spoke volumes. They ran the wet string across a space that looked much like every hacker workspace, salt water dripping on the industrial carpet. Experimenting and learning right where you are, using what you have on hand, are hallmarks of hacker resourcefulness. Fancy laboratory not required.

Thanks to [chris] and [Spencer] for the tips.

Filed under: Network Hacks

Alibaba Cloud Becomes the First Cloud Computing Company to Obtain C5 Attestation with Additional Requirements


Alibaba Cloud, the cloud computing arm of the Alibaba Group, announced today that it had completed its assessment for the Cloud Computing Compliance Controls Catalogue (C5).

Cloud storage now more affordable: Announcing general availability of Azure Archive Storage


Today we’re excited to announce the general availability of Archive Blob Storage, starting at an industry-leading price of $0.002 per gigabyte per month! Last year, we launched Cool Blob Storage to help customers reduce storage costs by tiering their infrequently accessed data to the Cool tier. Organizations can now reduce their storage costs even further by storing their rarely accessed data in the Archive tier. Furthermore, we’re also excited to announce the general availability of Blob-Level Tiering, which enables customers to optimize storage costs by easily managing the lifecycle of their data across these tiers at the object level.

From startups to large organizations, our customers in every industry have experienced exponential growth of their data. A significant amount of this data is rarely accessed but must be stored for a long period of time to meet either business continuity or compliance requirements; think employee data, medical records, customer information, financial records, backups, etc. Additionally, recent and coming advances in artificial intelligence and data analytics are unlocking value from data that might have previously been discarded. Customers in many industries want to keep more of these data sets for a longer period but need a scalable and cost-effective solution to do so.

“We have been working with the Azure team to preview Archive Blob Storage for our cloud archiving service for several months now.  I love how easy it is to change the storage tier on an existing object via a single API. This allows us to build Information Lifecycle Management into our application logic directly and use Archive Blob Storage to significantly decrease our total Azure Storage costs.”

-Tom Inglis, Director of Enabling Solutions at BP

Azure Archive Blob Storage

Azure Archive Blob storage is designed to provide organizations with a low cost means of delivering durable, highly available, secure cloud storage for rarely accessed data with flexible latency requirements (on the order of hours). See Azure Blob Storage: Hot, cool, and archive tiers to learn more.

Archive Storage characteristics include:

  • Cost-effectiveness: Archive access tier is our lowest priced storage offering for long-term storage which is rarely accessed. Preview pricing will continue through January 2018. For new pricing effective February 1, 2018, see Archive Storage General Availability Pricing.
  • Seamless Integration: Customers use the same familiar operations on objects in the Archive tier as on objects in the Hot and Cool access tiers. This will enable customers to easily integrate the new access tier into their applications.
  • Durability: All access tiers including Archive are designed to offer the same high durability that customers have come to expect from Azure Storage with the same data replication options available today.
  • Security: All data in the Archive access tier is automatically encrypted at rest using 256-bit AES encryption, one of the strongest block ciphers available.
  • Global Reach: Archive Storage is available today in 14 regions – North Central US, South Central US, East US, West US, East US 2, Central US, West US 2, West Central US, North Europe, West Europe, Korea Central, Korea South, Central India, and South India.
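As a rough back-of-the-envelope check on the headline price, here is a short Python sketch. It covers the capacity charge only; the $0.002/GB/month figure is the GA price quoted in this post, and real bills also include transaction, retrieval, and early-deletion fees, with per-region variation:

```python
ARCHIVE_PRICE_PER_GB_MONTH = 0.002  # USD; headline GA price quoted above

def monthly_archive_cost(size_tb):
    """Estimate the monthly at-rest cost of keeping `size_tb` terabytes
    in the Archive tier (capacity charge only)."""
    size_gb = size_tb * 1024
    return size_gb * ARCHIVE_PRICE_PER_GB_MONTH

# 100 TB of rarely accessed backups:
print(f"${monthly_archive_cost(100):,.2f} per month")
```

Even at the 100 TB scale typical of long-term backup sets, the at-rest charge stays in the low hundreds of dollars per month.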

Blob-Level Tiering: easily optimize storage costs without moving data

To simplify data lifecycle management, we now allow customers to tier their data at the object level. Customers can easily change the access tier of a single object among the Hot, Cool, or Archive tiers as usage patterns change, without having to move data between accounts. Blobs in all three access tiers can co-exist within the same account.
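Under the hood, changing an object's tier is a single REST operation (Set Blob Tier: a `PUT` against the blob with `comp=tier` and an `x-ms-access-tier` header). The Python sketch below only constructs the request; the account, container, blob, and SAS token are placeholders, and the `x-ms-version` value is an assumption for illustration:

```python
def build_set_tier_request(account, container, blob, tier, sas_token):
    """Return the (url, headers) pair for a Set Blob Tier call."""
    if tier not in ("Hot", "Cool", "Archive"):
        raise ValueError(f"unknown access tier: {tier}")
    url = (f"https://{account}.blob.core.windows.net/"
           f"{container}/{blob}?comp=tier&{sas_token}")
    headers = {
        "x-ms-access-tier": tier,
        "x-ms-version": "2017-04-17",  # assumed service version, for illustration
    }
    return url, headers

# With a valid SAS token, this would then be sent with any HTTP client,
# e.g. requests.put(url, headers=headers)
url, headers = build_set_tier_request(
    "myaccount", "backups", "2015-q4.bak", "Archive", "sv=...")
```

Because the operation addresses one blob at a time, lifecycle logic can demote individual objects as they age without moving data between accounts.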

Flexible management

Archive Storage and Blob-Level Tiering are available on both new and existing Blob Storage and General Purpose v2 (GPv2) accounts. GPv2 accounts are a new account type that support all our latest features, while offering support for Block Blobs, Page Blobs, Files, Queues, and Tables. Customers with General Purpose v1 (GPv1) accounts can easily convert their accounts to a General Purpose v2 account through a simple 1-click step (Blob Storage account conversion support coming soon). GPv2 accounts have a different pricing model than GPv1 accounts, and customers should review it prior to using GPv2 as it may change their bill. See Azure Storage Options to learn more about GPv2, including how and when to use it. 

A user may access Archive and Blob-Level Tiering via the Azure portal (Figure 1), PowerShell, CLI tools, the REST APIs, and the .NET (Figure 2), Java, Python, and Node.js client libraries.

Figure 1: Set blob access tier in portal


CloudBlockBlob blob = (CloudBlockBlob)item; // item returned by a container listing
blob.SetStandardBlobTier(StandardBlobTier.Archive);

Figure 2: Set blob access tier using .NET client library

Partner Integration

We integrate with a broad ecosystem of partners to jointly deliver solutions to our customers. The following partners support Archive Storage:

Commvault’s Windows/Azure-centric software solution enables a single solution for storage-agnostic, heterogeneous enterprise data management. Commvault’s native support for Azure, including being one of the first ISVs to be "Azure Certified", has been a key benefit for customers considering a Digital Transformation to Azure. Commvault remains committed to continuing our integration and compatibility efforts with Microsoft, befitting a close relationship between the companies that has existed for over 17 years. This includes quick, cost-effective and efficient movement of data to Azure while enabling indexing such that our customers can proactively use the data we send to Azure, including "Azure Archive". With this new Archive Storage offering, Microsoft again makes significant enhancements to their Azure offering and we expect that this service will be an important driver of new and expanding opportunities for both Commvault and Microsoft.

NetApp® AltaVault™ cloud-integrated storage enables customers to tap into cloud economics and securely back up data to Microsoft Azure cloud storage at up to 90% lower cost compared with on-premises solutions. AltaVault’s modern storage architecture optimizes data using class-leading deduplication, compression, and encryption. Optimized data is written to Azure Blob storage, reducing WAN bandwidth requirements and ensuring maximum data security. By adding Day 1 support for Azure Archive storage, AltaVault provides organizations access to the most cost-effective Azure Blob storage tier, significantly driving down costs for rarely accessed long-term backup and archive datasets. Try AltaVault’s free 90-day trial and see how easy it is to leverage Microsoft Azure Archive cloud storage today.

HubStor is a cloud archiving platform that converges long-term retention and data protection for on-premises file servers, Office 365, email, and other sources of unstructured data content. Delivered as Software-as-a-Service (SaaS) exclusively on the Azure cloud platform, HubStor is being adopted by IT teams to understand, secure, and manage large volumes of data in Azure with policies for classification, indexing, WORM retention, deletion, and tiering. As detailed in this post, customers can now apply HubStor’s built-in file analytics and storage tiering policies with the new Azure Archive Blob Storage tier to place the right data on the optimal tier at the best time in the information lifecycle. Enterprise Strategy Group recently completed a lab validation report on HubStor which you can download here.

The purpose of CloudBerry Backup for Microsoft Azure is to automate data upload to Microsoft Azure cloud storage. It can compress and encrypt the data with a user-defined password prior to upload, then securely transfers it to the cloud either on schedule or in real time. CloudBerry Backup also comes with file-system and image-based backup, SQL Server and MS Exchange support, as well as flexible retention policies and incremental backup. CloudBerry Backup now supports Microsoft Azure Archive Blob Storage for storing backup and archival data.

Archive2Azure, the intelligent data management and compliance archiving solution, provides customers a native Azure archiving application. Archive2Azure enables companies to provide automated retention, indexing on demand, encryption, search, review, and production for long term archiving of their compliance, active, low-touch, and inactive data from within their own Azure tenancy. This pairing of the Azure Cloud with Archive2Azure’s archiving and data management capabilities provides companies with the cloud-based security and information management they have long sought. With the general availability of Azure’s much anticipated Archive Storage offering, the needed security and lower cost to archive and manage data for extended periods is now possible. With the availability of the new Archive Storage offering, Archive2Azure can now offer Azure’s full range of storage tiers providing users a wide choice of storage performance and cost.

[Archive support coming soon] Cohesity delivers the world’s first hyper-converged storage system for enterprise data. Cohesity consolidates fragmented, inefficient islands of secondary storage into an infinitely expandable and limitless storage platform that can run both on-premises and in the public cloud. Designed with the latest web-scale distributed systems technology, Cohesity radically simplifies existing backup, file shares, object, and dev/test storage silos by creating a unified, instantly-accessible storage pool. The Cohesity platform will support Azure Archive Storage for the following customer use cases: (i) long-term data retention for infrequently accessed data that require cost effective lowest priced blob storage, (ii) blob-level tiering functionality among Hot, Cool and Archive tiers, and (iii) ease of recovery of data from cloud back to on-premises independent of which Azure blob tier the data is in. Note that Azure Blob storage can be easily registered and assigned via Cohesity’s policy-based administration portal to any data protection workload running on the Cohesity platform.

Igneous Systems delivers the industry’s first secondary storage system built to handle massive file systems. Offered as-a-Service and built using a cloud-native architecture, Igneous Hybrid Storage Cloud provides a modern, scalable approach to management of unstructured file data across datacenters and public cloud, without the need to manage infrastructure. Igneous supports backup and long-term archiving of unstructured file data to Azure Archive Blob Storage, enabling organizations to replace legacy backup software and targets with a hybrid cloud approach.

[Archive support coming soon] Rubrik orchestrates all critical data management services – data protection, search, development, and analytics – on one platform across all your Microsoft applications. By adding integration with Microsoft Azure Archive Storage Tier, Rubrik will complete support for all storage classes of Azure. With Rubrik, enterprises can now automate SLA compliance to any class in Azure with one policy engine and manage all archival locations in a single consumer-grade interface to meet regulatory and legal requirements. Leverage a rich suite of API services to create custom lifecycle management workflows across on-prem to Azure. Rubrik Cloud Data Management was architected from the beginning to deliver cloud archival services with policy-driven intelligence. Rubrik has achieved Gold Cloud Platform competency and offers end-to-end coverage of Microsoft technologies and services (physical or virtualized Windows, SQL, Hyper-V, Azure Stack, and Azure).

New Azure management and cost savings capabilities

The content below is taken from the original ( New Azure management and cost savings capabilities), to continue reading please visit the site. Remember to respect the Author & Copyright.

Enterprise customers choose Azure because of the unique value it provides as a productive, hybrid, intelligent and trusted cloud. Today I’m excited to announce four new management and cost savings capabilities. Azure Policy, now in public preview, provides control and governance at scale for your Azure resources. Azure Cost Management is rolling out support for Azure Virtual Machine Reserved Instances management later this week to help you maximize savings over time. To continue our commitment to making Azure cost-effective, we are reducing prices by up to 4% on our Dv3 Series in several regions in the coming days, and making our lowest-priced storage tier, Azure Archive Storage, generally available today.

Simple ways to ensure a secure and well-managed cloud infrastructure

Azure is committed to providing a secure cloud foundation, while making available a comprehensive set of services to ensure that your cloud resources are secure and well-managed. Cloud security and management is a joint responsibility between Microsoft and the customer. We recommend that customers follow secure and well-managed cloud best practices for every production virtual machine. To help you achieve this goal, Azure has built-in services that can be configured quickly, are always up to date and are tightly integrated into the Azure experience. Take advantage of Azure Security Center for security management and threat protection, back up data to protect against ransomware and human errors with Azure Backup, and keep your applications running with Azure Monitor and Log Analytics. Check out the new poster that describes the Azure security and operations management services.

Enterprise customers have asked for better ways to help them manage and secure cloud resources at scale to accelerate cloud adoption. Azure Policy allows you to turn on built-in policies or build your own custom policies to enable company-wide governance. For example, you can set your security policy for your production subscription once and apply that policy to multiple subscriptions. I am happy to announce that Azure Policy is now in public preview.

Most value for every cloud dollar spent

With Azure Cost Management, Azure is the only platform that offers an end-to-end cloud cost management and optimization solution to help customers make the most of their cloud investments across multiple clouds. Cost Management is free to all customers to manage their Azure spend. We are continuing to invest in bringing new capabilities to Cost Management. I am excited to announce that Cost Management supports Azure Reserved Virtual Machine Instances management starting December 15th.

In Azure, we have a long-standing promise of making our prices comparable with AWS on commodity services such as compute, storage, and bandwidth. In keeping with this commitment, we are happy to announce price reductions of up to 4% on our latest general-purpose virtual machines, the Dv3 Series, in US West 2, US East and Europe North. These prices will take effect on January 5th.

We often hear customers are looking to the cloud for cost-effective ways to manage and store their infrequently accessed data for use cases like backup and archiving. Today, we’re announcing general availability of Azure Archive Storage, our lowest priced Storage tier yet. You can learn more details here.

Azure is the most cost-effective cloud for Windows Server workloads. If you are a Windows Server customer with Software Assurance, you can combine Azure Reserved Instances (RIs) with Azure Hybrid Benefits and save up to 82% compared to pay-as-you-go prices, and up to 67% compared to AWS RIs for Windows VMs. In addition, with Azure Hybrid Benefit for SQL Server, customers with Software Assurance will be able to save even more.

There are many other ways to save money with Azure. To learn more, check out the new Azure Cost Savings infographic below.


Azure provides the broadest set of security and management capabilities built into a public cloud platform. With these capabilities, customers can more easily secure and manage hybrid infrastructure resources while achieving significant cost savings. Activate Security Center, Backup, Log Analytics and Cost Management today to ensure a secure and well-managed cloud infrastructure with optimized efficiency.

Lack of Migration Tools Can Cause Problems Moving to Teams

The content below is taken from the original ( Lack of Migration Tools Can Cause Problems Moving to Teams), to continue reading please visit the site. Remember to respect the Author & Copyright.

No-one Likes Migrations but Everyone Loves Teams

Let’s say that you decide to embrace Microsoft Teams and move some email traffic to the new platform. Or that you want to move away from a competing chat platform like Slack or HipChat because Teams is part of your Office 365 plan and better integrated with other Office 365 applications, or because Teams is taking over from Skype for Business Online. The question might then arise whether you need to move any information from your current platform to Teams.

In some cases, the answer is no, and you can start with a clean slate. Users finish up whatever they are working on with the old platform before moving to Teams. In other cases, the old platform holds corporate knowledge or other essential information (like records needed for compliance) that you must preserve before you can decommission that platform.

Moving Email to Teams

Email includes personal mailboxes, shared mailboxes, and site mailboxes. Because email exists alongside Teams, there is often no need to move anything unless you have an important message or attachment that must be in Teams. In this case, because you cannot drag and drop items from an email client into Teams, the easiest solution is to email it to the channel that you want it to be in.

Unless blocked by tenant settings, each channel has a unique email address in the form [email protected]. To get the address, click the ellipsis menu for the channel. You can then copy the email address revealed by Teams (Figure 1) and use it to send whatever information you want to the channel. The technique works for any application capable of sending email via SMTP.

Teams email address

Figure 1: Email address for a channel (image credit: Tony Redmond)

Teams stores copies of received messages and any attachments in the Email Messages folder for the channel in the SharePoint document library used by the team. Emailing individual items is tiresome if you must process more than a few items, but it is effective.
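Because a channel address accepts ordinary SMTP mail, the transfer can be scripted from any language. Below is a minimal sketch using Python's standard library; the channel address, sender, and SMTP host are placeholders you would substitute, and actual sending would go through your own mail relay via `smtplib`:

```python
from email.message import EmailMessage
from pathlib import Path

def message_for_channel(channel_address, subject, body, attachment=None):
    """Build a message addressed to a Teams channel; Teams files the message
    (and any attachment) in the channel's Email Messages folder."""
    msg = EmailMessage()
    msg["To"] = channel_address        # the address copied from the channel menu
    msg["From"] = "automation@example.com"   # placeholder sender
    msg["Subject"] = subject
    msg.set_content(body)
    if attachment is not None:
        msg.add_attachment(Path(attachment).read_bytes(),
                           maintype="application", subtype="octet-stream",
                           filename=Path(attachment).name)
    return msg

# Sending (placeholder relay):
# import smtplib
# smtplib.SMTP("smtp.example.com").send_message(
#     message_for_channel("channel@example.teams.ms", "Q4 report", "FYI"))
```

Looping such a script over a folder of items is one way to take the sting out of emailing more than a handful of messages.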

Moving Documents to Teams

The thing to remember about documents is that Teams uses SharePoint for its document management. Each team has a SharePoint team site with a document library and each channel in the team has a separate folder. Anything you can do to move documents around for SharePoint applies to Teams.

You can use email to move documents to Teams, but if the documents are already in a SharePoint library, it is better to create a tab to bring people to the library. If the team members have access to the target library, they can work with the files stored there through Teams.

First, get the URL for the target library by accessing it with a browser and copying the URL. Then, create a new SharePoint tab and insert the link you copied (Figure 2). Give the tab a name that tells users what the library holds.

Teams SharePoint Tab

Figure 2: Linking Teams to a SharePoint Library (image credit: Tony Redmond)

Linking a team to an existing SharePoint library might be a good way to move away from the now-deprecated site mailboxes. That is, unless you need offline access.

To move documents from other SharePoint libraries to those used by Teams, you can use SharePoint’s Move function. However, if you have hundreds of documents to move, it might be easier to synchronize both libraries with the OneDrive sync client and then copy whatever you need from the target library to the Teams library.

If you want to move documents from file servers or SharePoint on-premises servers, a range of third-party tools are available from ISVs such as ShareGate, Metalogix, and AvePoint. Alternatively, Microsoft announced their own SharePoint Migration Tool at Ignite 2017 (the tool is still in preview).

Moving Conversations to Teams

Apart from Teams, tenants can use Office 365 Groups and Yammer Groups for collaboration. Aside from emailing selected items, there is no way to move conversations hosted by these platforms to Teams.

Moving from Other Chat Platforms to Teams

Those who want to move information from other chat platforms might be out of luck. Although it is possible to extract data from platforms like HipChat and Slack, the issue is how to import that data into Teams. Documents are self-contained and can go into SharePoint; messages are a different matter.

To date, Microsoft has not created an API to import messages into Teams, perhaps because of the problems involved in taking information from different platforms and bringing that data into Teams in a way that the data is useful.

Processing imported data can be complex. For instance, assume you export messages from another platform. Unless you want to dump the messages as individual items into Teams, you might want to check date formats, connect the messages belonging to a conversation together so that Teams displays them as a conversation, and match user names against Azure Active Directory and team membership. This issue is not unique to Teams as some fix-up processing is usually needed whenever you move data between platforms.
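As a sketch of what that fix-up work might look like, the fragment below normalizes a Slack-style export: it parses epoch-second `ts` timestamps, groups replies with their parent message via `thread_ts`, and resolves user IDs through a directory lookup (a plain dict here, standing in for a real Azure Active Directory query). The `ts`/`thread_ts` field names follow Slack's export format; everything else is illustrative:

```python
from collections import defaultdict
from datetime import datetime, timezone

def normalize(messages, directory):
    """Group exported messages into conversations and resolve user IDs.
    `messages`: dicts with "user", "ts", "text", and optional "thread_ts".
    `directory`: user-ID -> display-name map (stand-in for an Azure AD lookup)."""
    threads = defaultdict(list)
    for m in messages:
        # Replies carry their root's ts as thread_ts; roots key on their own ts.
        threads[m.get("thread_ts", m["ts"])].append({
            "author": directory.get(m["user"], m["user"]),  # fall back to raw ID
            "posted": datetime.fromtimestamp(float(m["ts"]), tz=timezone.utc),
            "text": m["text"],
        })
    for msgs in threads.values():
        msgs.sort(key=lambda m: m["posted"])  # oldest first within a conversation
    return threads

export = [
    {"user": "U1", "ts": "1512950400.0001", "text": "Design review notes"},
    {"user": "U2", "ts": "1512950460.0002", "thread_ts": "1512950400.0001",
     "text": "Looks good"},
]
threads = normalize(export, {"U1": "Tony Redmond"})
```

Even this toy version shows the two failure modes to plan for: user IDs that no longer resolve in the directory, and orphaned replies whose parent message was not exported.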

The DIY Approach

Some light might be on the horizon in a GitHub project called “Channel Surf,” intended to help companies move from Slack to Teams. Although the author (Tam Huynh) is a Microsoft employee, this is a community project. As Tam explains in a post in the Microsoft Tech Community, the aim is to allow you to recreate an existing Slack channel structure in Teams.

The basic idea is that you create a Slack archive and use the data in the archive to recreate the channels in Teams. To populate the channel, the code generates HTML files for the Slack messages and copies them to Teams. The code also copies any attachments found in Slack to Teams.

Tam notes that he used public APIs in the project and that some functions needed to perform a true message import are not yet available. However, it is a work in progress that anyone can get involved with to improve.

No ISVs Support Teams Migration

Many ISVs offer migration products for Office 365 and it is surprising that none (that I can find) have any solution for Teams migration more than a year after Teams appeared in preview. The likely reason is the lack of a supported public API to allow ISVs to perform the necessary fix-up when moving data from different platforms into Teams.


Migration Might Not be a Road Block

Although no migration tools are available for Teams today, I do not think this is a road block for deployment. You can move high-value items like documents into SharePoint relatively easily and email individual messages if necessary. In the absence of tools to move conversations from Slack or other chat platforms, you can either wait for the market to mature and migration tools to appear or consider a cutover migration. It’s an imperfect situation for now.

Follow Tony on Twitter @12Knocksinna.

Want to know more about how to manage Office 365? Find what you need to know in “Office 365 for IT Pros”, the most comprehensive eBook covering all aspects of Office 365. Available in PDF and EPUB formats (suitable for iBooks) or for Amazon Kindle.

The post Lack of Migration Tools Can Cause Problems Moving to Teams appeared first on Petri.

A humanoid robot carried the Olympic torch in South Korea

The content below is taken from the original ( A humanoid robot carried the Olympic torch in South Korea), to continue reading please visit the site. Remember to respect the Author & Copyright.

One of the traditions of the Olympics is the torch relay, in which people carry the flame from Olympia, Greece to the location of the Games. In 2018, the Olympic Games will be held in Pyeongchang, South Korea, and the torch relay is currently underwa…