Magazine: The Microsoft Journal For Developers


THE MICROSOFT JOURNAL FOR DEVELOPERS AUGUST 2017 VOL 32 NO 8

magazine

Visual Studio Extensions......8


Creating Extensions for Multiple Visual Studio Versions
Carlos Quintero, page 8

How Xamarin.Forms Customization Took an FAA Drone App Higher
Dan Hermes, page 18

Git Internals: Architecture and Index Files
Jonathan Waldman, page 26

Actionable Messages for Outlook
Woon Kiat Wong, page 34

Batch Processing Using a Serverless Architecture
Joseph Fultz, page 44

COLUMNS

EDITOR'S NOTE
MEAN Machine
Michael Desmond, page 4

UPSTART
3 Demands: Mastering the Job Hunt
Krishnan Rangachari, page 6

TEST RUN
Deep Neural Network IO Using C#
James McCaffrey, page 58

THE WORKING PROGRAMMER
How To Be MEAN: Up-Angular-izing
Ted Neward, page 66

ESSENTIAL .NET
C# 7.0: Tuples Explained
Mark Michaelis, page 72

DON'T GET ME STARTED
Salt and Pepper
David Platt, page 80



Write Fast, Run Fast with Infragistics Ultimate Developer Toolkit

Includes 100+ beautiful, fast grids, charts, and other UI controls, plus productivity tools for quickly building high-performing web, mobile, and desktop apps.

Featuring
• Xamarin UI controls with innovative, code-generating productivity tools
• JavaScript/HTML5 and ASP.NET MVC components

Also includes controls for WPF, Windows Forms, and ASP.NET, plus prototyping, remote usability testing, and more.

Get started today with a free trial, reference apps, tutorials, and eBooks at Infragistics.com/Ultimate

To speak with sales or request a product demo with a solutions consultant call 1.800.231.8588

magazine

AUGUST 2017 VOLUME 32 NUMBER 8

General Manager Jeff Sandquist
Director Dan Fernandez
Editorial Director Mohammad Al-Sabt [email protected]
Site Manager Kent Sharkey
Editorial Director, Enterprise Computing Group Scott Bekker
Editor in Chief Michael Desmond
Features Editor Sharon Terdeman
Features Editor Ed Zintel
Group Managing Editor Wendy Hernandez
Senior Contributing Editor Dr. James McCaffrey
Contributing Editors Dino Esposito, Frank La Vigne, Julie Lerman, Mark Michaelis, Ted Neward, David S. Platt
Vice President, Art and Brand Design Scott Shultz
Art Director Joshua Gould

LEAD SERVICES
President Henry Allain
Chief Revenue Officer Dan LaBianca
Chief Marketing Officer Carmel McDonagh
Vice President, Lead Services Michele Imgrund
Senior Director, Audience Development & Data Procurement Annette Levee
Director, Audience Development & Lead Generation Marketing Irene Fincher
Director, Client Services & Webinar Production Tracy Cook
Director, Lead Generation Marketing Eric Yoshizuru
Director, Custom Assets & Client Services Mallory Bastionell
Senior Program Manager, Client Services & Webinar Production Chris Flack
Project Manager, Lead Generation Marketing Mahal Ramos

ART STAFF
Creative Director Jeffrey Langkau
Associate Creative Director Scott Rovin
Senior Art Director Deirdre Hoffman
Art Director Michele Singh
Art Director Chris Main
Senior Graphic Designer Alan Tao
Senior Web Designer Martin Peace

PRODUCTION STAFF
Print Production Coordinator Lee Alexander

ADVERTISING AND SALES
Chief Revenue Officer Dan LaBianca
Regional Sales Manager Christopher Kourtoglou
Advertising Sales Associate Tanya Egenolf

ONLINE/DIGITAL MEDIA
Vice President, Digital Strategy Becky Nagel
Senior Site Producer, News Kurt Mackie
Senior Site Producer Gladys Rama
Site Producer Chris Paoli
Site Producer, News David Ramel
Director, Site Administration Shane Lee
Front-End Developer Anya Smolinski
Junior Front-End Developer Casey Rysavy
Executive Producer, New Media Michael Domingo
Office Manager & Site Assoc. James Bowling

MARKETING
Chief Marketing Officer Carmel McDonagh
Vice President, Marketing Emily Jacobs
Marketing & Editorial Assistant Megan Burpo

ENTERPRISE COMPUTING GROUP EVENTS
Vice President, Events Brent Sutton
Senior Director, Operations Sara Ross
Senior Director, Event Marketing Merikay Marzoni
Events Sponsorship Sales Danna Vedder
Senior Manager, Events Danielle Potts
Coordinator, Event Marketing Michelle Cheng
Coordinator, Event Marketing Chantelle Wallace

Chief Executive Officer Rajeev Kapur
Chief Operating Officer Henry Allain
Chief Financial Officer Craig Rucker
Chief Technology Officer Erik A. Lindgren
Executive Vice President Michael J. Valenti
Chairman of the Board Jeffrey S. Klein

ID STATEMENT MSDN Magazine (ISSN 1528-4859) is published 13 times a year, monthly with a special issue in November by 1105 Media, Inc., 9201 Oakdale Avenue, Ste. 101, Chatsworth, CA 91311. Periodicals postage paid at Chatsworth, CA 91311-9998, and at additional mailing offices. Annual subscription rates payable in US funds are: U.S. $35.00, International $60.00. Annual digital subscription rates payable in U.S. funds are: U.S. $25.00, International $25.00. Single copies/back issues: U.S. $10, all others $12. Send orders with payment to: MSDN Magazine, P.O. Box 3167, Carol Stream, IL 60132, email [email protected] or call (847) 763-9560. POSTMASTER: Send address changes to MSDN Magazine, P.O. Box 2166, Skokie, IL 60076. Canada Publications Mail Agreement No: 40612608. Return Undeliverable Canadian Addresses to Circulation Dept. or XPO Returns: P.O. Box 201, Richmond Hill, ON L4B 4R5, Canada.

Printed in the U.S.A. Reproductions in whole or part prohibited except by written permission. Mail requests to "Permissions Editor," c/o MSDN Magazine, 4 Venture, Suite 150, Irvine, CA 92618.

LEGAL DISCLAIMER The information in this magazine has not undergone any formal testing by 1105 Media, Inc. and is distributed without any warranty expressed or implied. Implementation or use of any information contained herein is the reader's sole responsibility. While the information has been reviewed for accuracy, there is no guarantee that the same or similar results may be achieved in all environments. Technical inaccuracies may result from printing errors and/or new developments in the industry.

CORPORATE ADDRESS 1105 Media, 9201 Oakdale Ave. Ste 101, Chatsworth, CA 91311, www.1105media.com

MEDIA KITS Direct your Media Kit requests to Chief Revenue Officer Dan LaBianca, 972-687-6702 (phone), 972-687-6799 (fax), [email protected]

REPRINTS For single article reprints (in minimum quantities of 250-500), e-prints, plaques and posters contact: PARS International. Phone: 212-221-9595. E-mail: [email protected]. Web: www.magreprints.com/QuickQuote.asp

LIST RENTAL This publication's subscriber list, as well as other lists from 1105 Media, Inc., is available for rental. For more information, please contact our list manager, Jane Long, Merit Direct. Phone: 913-685-1301; E-mail: [email protected]; Web: www.meritdirect.com/1105

Reaching the Staff
Staff may be reached via e-mail, telephone, fax, or mail. E-mail: To e-mail any member of the staff, please use the following form: [email protected]
Irvine Office (weekdays, 9:00 a.m. – 5:00 p.m. PT): Telephone 949-265-1520; Fax 949-265-1528; 4 Venture, Suite 150, Irvine, CA 92618
Corporate Office (weekdays, 8:30 a.m. – 5:30 p.m. PT): Telephone 818-814-5200; Fax 818-734-1522; 9201 Oakdale Avenue, Suite 101, Chatsworth, CA 91311

The opinions expressed within the articles and other contents herein do not necessarily express those of the publisher.

2 msdn magazine



Editor’s Note MICHAEL DESMOND

MEAN Machine

It was August 2015 when Ted Neward officially settled down. Neward for years has used his column, The Working Programmer, as a platform to cover everything and anything—from his 10-part opus on Multiparadigmatic .NET development, to his experiment with an ELIZA-like intelligent conversation bot (this back in 2012, mind you), to his work with alternative frameworks and databases like Oak, Cassandra and MongoDB. And don't even get me started on his brilliant LOLCODE column (msdn.com/magazine/dn166934)—I'm still chuckling about that.

But it was two years ago this month that Neward—accidentally, it turns out—settled on a single topic. Since his August 2015 column, "How To Be MEAN: Getting Started" (msdn.com/magazine/mt185576), Neward has been exploring the popular MEAN stack, consisting of MongoDB, Express, Angular and Node.js. That article was intended to be the first of perhaps half a dozen columns exploring MEAN. Now, 24 months later, Neward is still at it. And judging by the online traffic his columns are generating, he could go another 24 months.

"I originally thought this would maybe be a six- to nine-piece series, and then we'd move on to some other things," Neward says. "In fact, I have a number of .NET-centric ideas waiting in the wings, including a piece or two on static code analyzers, which is about as far from the world of dynamic, typeless, JavaScript-y programming as you can get."

Neward describes writing about the MEAN stack as "both tricky and rewarding," with frequent, major updates unveiling green fields to explore. This was particularly true of the Angular 2 release in September 2016, which Neward describes as a "complete transformation," but he also singles out changes to both TypeScript and the ECMAScript language. And while Neward says MEAN can solve thorny problems that seem to bedevil other platforms, the bill eventually comes due.

"When you're deeper in, you discover that what was easy on your old platform—like .NET or JVM or whatever—is not so easy here, and you're forced to sit back in your chair and go, 'Huh.'"

At the end of the day, MEAN is just an architectural stack, much like LAMP (Linux, Apache, MySQL, PHP) before it, and Neward cautions against ascribing too much to it.

"We talked about this a year or so ago: You can build an ASP.NET WebAPI + CouchDB + ReactJS stack, and call it 'ARC,' if you like, and have just as much success with it as you can with MEAN."

Neward should know. He's been having success with MEAN for two years now, and he's not done yet. In upcoming issues, he says he plans to dive into Angular Routing, and from there into testing and forms.
Visit us at msdn.microsoft.com/magazine. Questions, comments or suggestions for MSDN Magazine? Send them to the editor: [email protected].

© 2017 Microsoft Corporation. All rights reserved.


Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, you are not permitted to reproduce, store, or introduce into a retrieval system MSDN Magazine or any part of MSDN
Magazine. If you have purchased or have otherwise properly acquired a copy of MSDN Magazine in paper format, you are permitted to physically transfer this paper copy in unmodified form. Otherwise, you are not permitted to transmit
copies of MSDN Magazine (or any part of MSDN Magazine) in any form or by any means without the express written permission of Microsoft Corporation.
A listing of Microsoft Corporation trademarks can be found at microsoft.com/library/toolbar/3.0/trademarks/en-us.mspx. Other trademarks or trade names mentioned herein are the property of their respective owners.
MSDN Magazine is published by 1105 Media, Inc. 1105 Media, Inc. is an independent company not affiliated with Microsoft Corporation. Microsoft Corporation is solely responsible for the editorial contents of this magazine. The
recommendations and technical guidelines in MSDN Magazine are based on specific environments and configurations. These recommendations or guidelines may not apply to dissimilar configurations. Microsoft Corporation does not make
any representation or warranty, express or implied, with respect to any code or other information herein and disclaims any liability whatsoever for any use of such code or other information. MSDN Magazine, MSDN and Microsoft logos are
used by 1105 Media, Inc. under license from owner.



MSDN Magazine Vendor Profile

ActivePDF provides developers and IT professionals with the capability to move closer to a fully
functional digital environment. Digitalize your organization by integrating PDF manipulation
and automation into your business process workflow. Join the PDF Revolution!

ActivePDF Family of Products

DocGenius™: Developer tools for creating software applications that have embedded PDF functionality.
• Toolkit: Scalable & flexible PDF processing library
• Xtractor: Search PDF to retrieve or extract text & images
• WebGrabber: Server-based HTML-to-PDF conversion
• Server: PDF generation for legacy software

DocSight™: Unattended server applications that enable creation, conversion and other PDF functionality.
• OCR: Image conversion of PDF files to searchable text
• DocConverter: High-volume document conversion
• Meridian: Network PDF printer for unlimited users

DocSpace™: Server applications with user interfaces for viewing, creating and interacting with PDF.
• ReaderPlus: Browser-based PDF viewer and editor

Download your FREE 30-day trial at www.ActivePDF.com
Toll Free US: 866 468 6733 | Outside US: +1 949 582 9002



Upstart KRISHNAN RANGACHARI

3 Demands: Mastering the Job Hunt


When you're job hunting, there are three demands that a potential employer might make: skills, experience and achievements. The more you can exceed expectations on these demands, the more likely you'll get the job.

Skills
Maybe all your experience is in the Microsoft .NET Framework and Azure, but a job you're interested in is recruiting developers for Java and a competing cloud platform. What do you do?

First, about 30 percent to 40 percent of technology companies (including the likes of Facebook and Google) are technology-agnostic in their hiring. Even if you have zero experience in their primary stacks, they'll hire you if you're a good engineer; they'll trust you to learn the tools and languages quickly.

Second, another 20 percent to 30 percent of companies will give "credit" for similar-enough technologies. For example, some shops on a non-Azure cloud will "honor" your Azure experience, and some Java shops will look favorably on your .NET experience.

At the remaining 30 percent to 50 percent of companies—the ones that are looking for specific stack experience—developers make the mistake of trying to sell themselves as "quick learners." The problem is everybody claims to be a quick learner!

Instead, the trick is to gain and demonstrate some experience in the stack in which you have zero experience. Your day job may be in .NET, but you can take on skunkworks or tool projects at work using other technologies. Typically, you have more technical wiggle room with non-core projects. If you do this over a few months, you can shift your resume from being 100 percent C# projects to, say, 80 percent C# and 20 percent Python.

Then, in your interviews, if you get grilled on your stack experience, you can say, "My background is in C#. Over the last few months, I've been using Python more and more in my projects, and I like it a lot. One of the things that's attracting me to this role is the opportunity to use Python even more."

This comes off as not only sincere, but its vulnerable nature protects you. You aren't claiming to be a Python god or goddess, but you're also not bemoaning your Python newbie status. Plus, you've weaved a story of how your past experience leads into this future opportunity.

Experience
Sometimes, the whole point of a job switch is that you want a better position than the one you're in right now. In effect, you want to "up-level." But how do you upgrade from a manager position to a director one? How do you get hired as a manager if you've never managed people?

Upgrading from a team lead to a senior manager or director is relatively easy, especially if you're at a big company. You can demand a loftier title when you switch to a smaller company or a startup. The smaller company appreciates your big-company experience, maturity, expertise, and skillset, and you appreciate the upgraded title, the greater scope of your responsibilities, and the opportunity for more direct impact.

Now, if you have zero management experience and your goal is to become a manager at a new job, it's more challenging. That's a little bit like a 12-year-old you've never met asking to babysit your 1-year-old. The 12-year-old isn't a known commodity, and neither are non-managers who want to get hired as managers.

In such situations, it helps for you to become an "acting manager" at your current role. This means you can manage interns, mentor junior- and mid-level engineers, and act as a project manager on some projects, as technical lead on others, and as design architect on still others. This way, you don't have to ask the companies you interview with to trust your "potential." Instead, you can showcase your status as a manager in your responsibilities (if not in title) and share five to 10 experiences to prove it.

Achievements
Sometimes, it may feel like the world is being flooded with new software engineers. How do you stand out in this virtually indistinguishable ocean?

While the number of software engineers keeps increasing, I observe—in my interactions with hundreds of software engineers each year—that there's still an extreme shortage of self-aware software engineers with strong communication skills and good technical chops.

So, the more you invest in yourself as an engineer, the more you stand out and live up to your own potential. This includes practicing mock technical interviews on sites like pramp.com, devouring technical interview books and courses, investing in your own coaching and personal development, and polishing your resume and story-telling skills. It also includes mastering how to position yourself in every stage of the job hunt.

Ultimately, you aren't competing with an army of mediocre software engineers; in fact, the more mediocre engineers there are, the easier it is to stand out! You're competing with yourself to set extraordinarily high standards of performance, and pursue them with determination, dedication and enthusiasm.

Krishnan Rangachari helps brilliant developers have amazing careers. Visit RadicalShifts.com for his free courses.


VISUAL STUDIO

Creating Extensions
for Multiple
Visual Studio Versions
Carlos Quintero

The release of a new version of Visual Studio is always a challenge for developers of extensions (packages, add-ins, templates and so forth). For example, Visual Studio 2010 introduced the new Visual Studio Installer for eXtensions (VSIX files); Visual Studio 2012 introduced the light/dark themes; and Visual Studio 2015 removed add-ins (with the Add-In Manager); not to mention that each Visual Studio version provides a new SDK, new extensibility assemblies and new APIs. With Visual Studio 2017, this challenge is even bigger, due to its new modular setup based on workloads and individual components, and to a new version of the manifest for the VSIX deployment mechanism. While some developers (most notably from Microsoft) release a different new extension for each Visual Studio version, most would prefer to release a single updated extension that can target the widest range of Visual Studio versions.

In this article, I'll show you how to accomplish this. For this purpose, I'll focus on the most common scenario: a package with a command, created in a managed language (C#, in this case) and deployed as a VSIX file.

The goals to be accomplished are the following:
• To use a single Visual Studio project to create the package.
• To use Visual Studio 2017 for development and debugging.
• To generate a single package DLL as the result of the build.
• To put that single DLL inside a single VSIX file.
• To be able to install that VSIX file on Visual Studio 2017 and on many past versions (2015, 2013 and so on).

Because two artifacts are needed—a DLL file (which is the package) and a VSIX file (which is the deployment vehicle for the package)—I'll explain each of these separately: First, how they work at installation or run time; second, how to develop them.

This article discusses:
• How Visual Studio extensions are created, deployed and installed
• How to create a single package for multiple Visual Studio versions
• How to deploy a package with a single VSIX file

Technologies discussed:
Visual Studio 2012, 2013, 2015 and 2017, Microsoft .NET Framework, VSIX files

Code download available at:
msdn.com/magazine/0817magcode

The VSIX File
As mentioned earlier, Visual Studio 2010 introduced the VSIX deployment mechanism to install Visual Studio extensions, and it's been the preferred way ever since. A VSIX file has the extension .vsix and can be installed in different ways. If the VSIX file is published on the Visual Studio Marketplace (formerly Visual Studio Gallery) and it's compatible with the Visual Studio version and edition you're using, you can install it using the Extensions and Updates dialog. Under the Tools menu, click on Extensions and Updates and then go to Online | Visual Studio Marketplace (see Figure 1).


Figure 1 The Extensions and Updates Dialog Window

You can also double-click a VSIX file. When this happens, a Visual Studio Launcher (C:\Program Files (x86)\Common Files\Microsoft Shared\MSEnv\VSLauncher.exe) associated to the .vsix file extension is executed; this locates the VSIXInstaller.exe utility of the highest installed Visual Studio version (the highest version is required to be able to install to all lower versions). Then, the VSIX Installer shows the dialog in Figure 2 so you can select the compatible Visual Studio versions and editions in which to install the extension.

Figure 2 The VSIX Installer

VSIX files can be installed programmatically, too, using the VSIXInstaller.exe utility with its command-line options, such as the target Visual Studio version (2017, 2015 and so on) and edition (Community, Professional and the like). You can find that utility in the Common7\IDE subfolder of your Visual Studio installation.

In any case, either Visual Studio or the VSIXInstaller.exe utility needs to know which Visual Studio versions and editions the VSIX file supports. That information can be discovered via a manifest file inside the file. The VSIX file is actually a .zip file, so you can rename its .vsix file extension to .zip and then open it to examine its contents (see Figure 3).

As you can see, there are several files inside: The .dll file is the package DLL. The .pkgdef file is used at installation time to add some keys to the Windows Registry that allow Visual Studio to recognize the DLL as a package. The [Content_Types].xml file describes the content type for each file extension (.dll, .json and so forth). The catalog.json and manifest.json files are required by Visual Studio 2017. And the extension.vsixmanifest file describes the name of the extension, version, and more, and which Visual Studio versions and editions it supports. You can unzip the extension.vsixmanifest file and open it with a text editor to examine its contents, which will look similar to what's shown in Figure 4.

As you can see, the manifest states in the InstallationTarget XML element the supported Visual Studio editions. Here, the Microsoft.VisualStudio.Pro value targets the Professional edition and higher, such as the Premium, Ultimate, Enterprise and any other such editions. Note that it also targets the Community edition, which is basically a Professional edition with some licensing restrictions and without some features. It also states the range of supported Visual Studio versions: 10.0 (2010), 11.0 (2012), 12.0 (2013), 14.0 (2015), 15.0 (2017).

When the VSIX file of a per-user extension is installed (either by Visual Studio or by the VSIX Installer), the files inside are unzipped and copied to a random folder at this location: C:\Users\<user>\AppData\Local\Microsoft\VisualStudio\<version number>\Extensions\<random folder>. The <version number> can have an "Exp" suffix appended for the "Experimental Instance" (explained later), and for Visual Studio 2017 it will also include the "instance id" of the installed Visual Studio. This instance id is randomly generated at Visual Studio install time; it was added to support side-by-side installations of different editions of the same version (2017) of Visual Studio, something that wasn't possible before. For machine-wide extensions, the subfolder Common7\IDE\Extensions is used. Notice that in any case each Visual Studio version uses its own folder for its extensions.

While it would be nice if all Visual Studio versions supported the same manifest format, unfortunately that's not the case. Visual Studio 2010 introduced VSIX and the first version of the manifest. Visual Studio 2012 introduced version 2, which is completely different and incompatible with version 1. However, Visual Studio 2012, 2013 and 2015—all of which support version 2—can still accept a version 1 manifest, so you can build a VSIX file with a version 1 manifest and target from Visual Studio 2010 to Visual Studio 2015. But Visual Studio 2017 supports neither version 1 nor version 2. Instead, it requires a third version of the manifest. Fortunately, version 3 keeps using the value "2.0.0.0" in the Version attribute of the PackageManifest XML element and adds only an XML element named <Prerequisites> (and the two new files, catalog.json and manifest.json, into the VSIX file). So, it's completely backward-compatible with the second version, supported by Visual Studio 2012, 2013 and 2015 (but not by Visual Studio 2010, which only supports version 1). This means that you can't target Visual Studio 2010-2017 with a single VSIX file. From this point, I'll give up on Visual Studio 2010 and will continue with a VSIX file that supports Visual Studio 2012, 2013, 2015 and 2017.
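As a sketch of the unattended, command-line route mentioned above: the switch names here are assumptions based on common VSIXInstaller builds, and the file names are placeholders; run VSIXInstaller.exe /? to see the exact set your version supports.

```cmd
:: Run from the Common7\IDE folder of the highest installed Visual Studio.
:: /quiet suppresses the installer UI.
VSIXInstaller.exe /quiet MyExtension.vsix

:: Some builds accept edition/version selectors (availability varies by release):
VSIXInstaller.exe /quiet /skuName:Pro /skuVersion:14.0 MyExtension.vsix
```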

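A note on the version ranges used by InstallationTarget elements: they are written in interval notation, where a square bracket is inclusive and a parenthesis exclusive. This tiny model (my own helper for illustration, not part of any Visual Studio SDK) shows why a range such as [11.0,16.0) covers Visual Studio 2012 through 2017 but excludes 2010:

```python
def in_installation_target(version: float, low: float, high: float) -> bool:
    """Model of the manifest's [low,high) range: low inclusive, high exclusive."""
    return low <= version < high

# Visual Studio internal versions: 2010=10.0, 2012=11.0, 2013=12.0, 2015=14.0, 2017=15.0
for v in (11.0, 12.0, 14.0, 15.0):
    assert in_installation_target(v, 11.0, 16.0)       # 2012-2017 are covered
assert not in_installation_target(10.0, 11.0, 16.0)    # 2010 falls outside
```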


all use CLR 4.0, the CLR version is not a problem when the DLL of
an extension targets Visual Studio 2012, 2013, 2015 and 2017.
Libraries constitute the second part of a .NET Framework;
these are DLLs referenced by a Visual Studio project and used at
run time. To develop a single extension that targets multiple ver-
sions of Visual Studio, you must use the highest .NET Framework
installed by default by the lowest Visual Studio version that you
want to target. This means that if you want to target Visual Studio
2012 and higher, you need to use .NET Framework 4.5. You can’t
Figure 3 Contents of a VSIX File
use, say, .NET Framework 4.5.1 introduced by Visual Studio 2013,
because any DLL introduced in that version would not be present
The Package DLL on a computer with only Visual Studio 2012 installed. And unless
A managed Visual Studio package is a DLL that contains a class that you really need that DLL, you won’t want to force such users to
inherits from Microsoft.VisualStudio.Shell.Package. It’s decorated install .NET Framework 4.5.1 to use your extension (it could hurt
with certain attributes that help at build time to generate a .pkgdef sales or downloads and support).
file (which, as mentioned earlier, you can find inside the VSIX file
and in the installation folder of the extension). The .pkgdef file is
used at startup (older versions of Visual Studio) or at installation The release of a new version
time (version 15.3 of Visual Studio 2017) to register the DLL as a
package for Visual Studio. Once it’s registered, Visual Studio will of Visual Studio is always a
try to load the package at some point, either on startup or when
one of its commands is executed if the package uses delay loading challenge for developers
(which is the best practice). During the attempt to load the man-
aged DLL and initialize the package, three things happen: the DLL of extensions.
will be loaded by the Common Language Runtime (CLR) of a
Microsoft .NET Framework version; it will use some DLLs pro-
vided by a .NET Framework; and it will use some DLLs provided The extension also needs DLLs that are provided by Visual Studio
by Visual Studio. I will examine each of these in turn. (typically named Microsoft.VisualStudio.*). At run time, Visual
A .NET Framework is the sum of two things: The CLR + libraries Studio finds its DLLs at some well-known locations, such as the
(both base class and additional libraries). The CLR is the runtime (the folder Common7\IDE with its subfolders Common7\IDE\Public­
JIT compiler, garbage collector and so forth) and it loads managed Assemblies and Common7\IDE\PrivateAssemblies, and from the
DLLs. In the distant past, each .NET Framework version 1.0, 1.1 and Global Assembly Cache (GAC). The GAC for .NET Framework 4.x
2.0 (used by Visual Studio.NET 2002, Visual Studio.NET 2003 and is located at C:\Windows\Microsoft.NET\assembly (there’s another
Visual Studio 2005) provided its own CLR version (1.0, 1.1 and 2.0). GAC at C:\Windows\assembly, but that one is for older .NET
However, the .NET Frameworks 3.0 and 3.5, used by Visual Studio Frameworks). Visual Studio 2017 uses a more isolated installation that
2008, continued to use the exact same CLR 2.0 of .NET Framework avoids the GAC, relying instead on the folders described previously.
2.0, instead of introducing a new one. Visual Studio 2010 intro- There are a couple of key principles to follow when developing
duced .NET Framework 4 and CLR 4.0, but since then all new .NET and generating a VSIX file: You must use the versions provided by
Frameworks 4.x have used CLR 4.0 (although swapping it “in-place” the lowest Visual Studio version your extension targets. That means
with a backward-compatible version rather than reusing the exact that if you want to target Visual Studio 2012 and higher, you must
CLR 4.0 of .NET Framework 4). Since Visual Studio 2012 and higher use only assemblies and extensibility APIs provided by that ver-

Figure 4 The Contents of a Manifest File


10 msdn magazine Visual Studio

0817msdn_QuinteroExt_v3_8-16.indd 10 7/12/17 11:56 AM


Untitled-6 1 3/6/17 2:32 PM
If your extension uses a DLL introduced by Visual Studio 2013 or higher, the extension won't work on a machine with only Visual Studio 2012. The second principle is that the extension must never deploy Visual Studio DLLs, neither to the locations I mentioned (folders of Visual Studio or GAC), nor to the installation folder of the extension. These DLLs are provided by the target Visual Studio, which means that the VSIX file shouldn't include them.

Many Visual Studio DLLs have a version number (8.0 … 15.0) in the name, such as Microsoft.VisualStudio.Shell.11.0.dll or Microsoft.VisualStudio.Shell.Immutable.10.0.dll. These help to identify the Visual Studio version that introduced them, but don't get fooled: it's a name, not a version. For example, there are four versions (11.0.0.0, 12.0.0.0, 14.0.0.0 and 15.0.0.0) of Microsoft.VisualStudio.Shell.11.0.dll, each one provided, respectively, by a Visual Studio version (2012, 2013, 2015 and 2017). The first three, 11.0.0.0 to 14.0.0.0, are installed by the respective Visual Studio version in the GAC, and the fourth version, 15.0.0.0, used by Visual Studio 2017, is installed in the Common7\IDE\PrivateAssemblies folder.

Because an extension that targets Visual Studio 2012 and higher must use Visual Studio assemblies with version 11.0.0.0 (the first principle mentioned earlier), this means that the reference to Microsoft.VisualStudio.Shell.11.0.dll must be version 11.0.0.0. But because that version isn't installed by Visual Studio 2013 and higher (they start at version 12.0.0.0), and the extension shouldn't deploy Visual Studio DLLs (the second principle), wouldn't the extension fail when trying to use that Visual Studio DLL? The answer is no, and it's thanks to an assembly-binding redirection mechanism provided by the .NET Framework, which allows you to specify rules like "when something requests this version of an assembly, use this newer version of it." Of course, the new version must be fully backward-compatible with the old version. There are several ways to redirect assemblies from one version to another. One way is this: An executable (.exe file extension) can provide an accompanying configuration file (.exe.config file extension) that specifies the redirections. So, if you go to the Common7\IDE folder of your Visual Studio installation, you'll find the devenv.exe executable of Visual Studio, and a devenv.exe.config file. If you open the .config file with a text editor, you'll see that it contains lots of assembly redirections:

  <dependentAssembly>
    <assemblyIdentity
      name="Microsoft.VisualStudio.Shell.11.0"
      publicKeyToken="b03f5f7f11d50a3a"
      culture="neutral"/>
    <bindingRedirect
      oldVersion="2.0.0.0-14.0.0.0"
      newVersion="15.0.0.0"/>
  </dependentAssembly>

So, Visual Studio 2017 (15.0) has an assembly version redirection for Microsoft.VisualStudio.Shell.11.0 that states that whenever something requests old versions 2.0.0.0 to 14.0.0.0, use the new version 15.0.0.0 instead. That's how Visual Studio 2013 or later can use an extension referencing Microsoft.VisualStudio.Shell.11.0 version 11.0.0.0, even if they don't provide that exact version.

Developing the Extension
Now that you know how things work at run time, you can develop the package. To recap, you'll create a VSIX project using Visual Studio 2017 with a manifest that targets Visual Studio versions from 11.0 to 15.0; it will contain a package and a command; and it will use only references with version 11.0.0.0 (or lower) installed by Visual Studio 2012.

You might wonder at this moment which Visual Studio versions should be installed on your development machine. The best practice is to have two development machines as follows: On the first, if you have enough space on your disk, install all the Visual Studio versions—2012, 2013, 2015 and 2017. They can all coexist side by side and you'll be able to test them during development. For Visual Studio 2017, even different editions such as Community, Professional and Enterprise can coexist at the same time, something that wasn't possible with older versions of Visual Studio. If available space is a concern, install the minimal components for the old versions, or skip some version in the middle of the range (2013 or 2015). On your second development machine, install only Visual Studio 2017 or, even better, a build server with no Visual Studio version installed (just the Build Tools 2017), to build your extension for release. This approach will help ensure that you're not inadvertently using DLLs or other dependencies from folders installed by older Visual Studio versions. You might also wonder if it wouldn't be safer to develop or build on a machine with only Visual Studio 2012 installed, and the answer is that it's not possible: To generate a VSIX file for Visual Studio 2017 (which creates a version 3 manifest and adds the catalog.json and manifest.json files), you need the Visual Studio SDK 15.0 of Visual Studio 2017 or, with some work, the Visual Studio SDK 14.0 of Visual Studio 2015. Neither the Visual Studio SDK 12.0 of Visual Studio 2013 nor the Visual Studio SDK 11.0 of Visual Studio 2012 can generate VSIX files for Visual Studio 2017.

And the best practice for (serious) testing is: Use a separate machine (virtual or cloud-based) for each Visual Studio version (so you'll need four machines to test your extension on Visual Studio 2012 to Visual Studio 2017 in isolation). This best practice helped me to find some errors in the code sample for this article!

To get the Visual Studio 2017 project templates to create a package (or any other kind of extension) you need the "Visual Studio extension development" workload. If you didn't install it when you first installed Visual Studio 2017, go to the folder C:\Program Files (x86)\Microsoft Visual Studio\Installer, launch vs_Installer.exe, click the Modify button and select that workload at the bottom of the list.

Create a new VSIX project using the File | New | Project menu; go to the Visual C# | Extensibility templates; ensure you've selected .NET Framework 4.5 on the dropdown list at the top; and select the
VSIX Project template. Name the project VSIXProjectVS2012_2017. Double-click the source.extension.vsixmanifest file to open its custom editor. In the Metadata tab, set the product name, author, version and so on. In the Install Targets tab, click the Edit button, select the Microsoft.VisualStudio.Pro identifier (that value also targets the Community edition, which is basically a Professional edition) and set the target installation range, [11.0,15.0], as shown in Figure 5. A square bracket means the value is included. A parenthesis would mean that the value is excluded, so you can also set [11.0,16.0). You can also target a minor version (like 15.3) using the build number (such as 15.0.26208.1).

Figure 5 Installation Targets
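In the manifest XML that the Install Targets editor writes, that installation range ends up looking roughly like this (a sketch; the exact attribute layout may differ slightly from what the designer generates):

```xml
<Installation>
  <!-- [11.0,15.0] installs on Visual Studio 2012 (11.0) through 2017 (15.0), both inclusive -->
  <InstallationTarget Id="Microsoft.VisualStudio.Pro" Version="[11.0,15.0]" />
</Installation>
```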
In the Dependencies tab, delete all items. In the Prerequisites tab, click the Edit button and set the minimal Visual Studio 2017 component your extension requires. In this example, only the Visual Studio core editor is required. This section is new for Visual Studio 2017 and the version 3 manifest, so it only applies to version 15.0 (see Figure 6).

Figure 6 Prerequisites
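In the version 3 manifest, the core-editor prerequisite is expressed roughly like this (a sketch; the component Id shown is the usual identifier for the core editor, but verify it against what the manifest designer emits):

```xml
<Prerequisites>
  <!-- Applies only to Visual Studio 2017 (15.0) and the version 3 manifest -->
  <Prerequisite Id="Microsoft.VisualStudio.Component.CoreEditor" Version="[15.0,)" DisplayName="Visual Studio core editor" />
</Prerequisites>
```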
Add a package to the VSIX project by right-clicking the VSIX project node in Solution Explorer, then select the Add | New Item menu to bring up the Add New Item dialog. Now, go to the Visual Studio C# Items | Extensibility | VSPackage node, select the Visual Studio Package template and name it MyPackage.cs. Add a command to the package repeating the actions of the previous step, but selecting this time the Custom Command template. Name this MyCommand1.cs.

To follow the principle of using the fewest dependencies required, in the source code of MyPackage.cs and MyCommand1.cs, remove the unused (grayed) namespaces. Then right-click the VSIX project node in Solution Explorer and click the Manage NuGet Packages for Solution entry. In the Installed section, uninstall all the packages in the order shown here:

  Microsoft.VisualStudio.Shell.15.0
  Microsoft.VisualStudio.Shell.Framework
  Microsoft.VisualStudio.CoreUtility
  Microsoft.VisualStudio.Imaging
  Microsoft.VisualStudio.Shell.Interop.12.0
  Microsoft.VisualStudio.Shell.Interop.11.0
  Microsoft.VisualStudio.Shell.Interop.10.0
  Microsoft.VisualStudio.Threading
  Microsoft.VisualStudio.Shell.Interop.9.0
  Microsoft.VisualStudio.Shell.Interop.8.0
  Microsoft.VisualStudio.TextManager.Interop.8.0
  Microsoft.VisualStudio.Shell.Interop
  Microsoft.VisualStudio.TextManager.Interop
  Microsoft.VisualStudio.Validation
  Microsoft.VisualStudio.Utilities
  Microsoft.VisualStudio.OLE.Interop

(Don't uninstall the Microsoft.VSSDK.BuildTools package, which is the Visual Studio SDK.)

In the project's References node in Solution Explorer, uninstall all the remaining references (that weren't acquired as NuGet packages) except System and System.Design. Now you can rebuild the solution. You'll get compilation errors that will be solved by adding the references shown in Figure 7.

Unfortunately, Microsoft doesn't provide an official NuGet package for Microsoft.VisualStudio.Shell.11.0 (you can find an unofficial NuGet VSSDK.Shell.11 package, though). If you have Visual Studio 2012 installed (you should if that's the minimal-supported version for your extension), you can get it from the GAC as explained earlier. Alternatively, you can get all the required assemblies by installing the Visual Studio 2012 SDK (bit.ly/2rnGsfq) that provides them in the subfolders v2.0 and v4.0 of the folder C:\Program Files (x86)\Microsoft Visual Studio 11.0\VSSDK\VisualStudioIntegration\Common\Assemblies. The last column of the table shows the subfolder of the Visual Studio 2012 SDK where you can find each assembly.

Figure 7 Visual Studio 2012 References

  Assembly Name                                | Assembly Version | Visual Studio 2012 SDK Subfolder
  Microsoft.VisualStudio.OLE.Interop           | 7.1.40304.0      | v2.0
  Microsoft.VisualStudio.Shell.Interop         | 7.1.40304.0      | v2.0
  Microsoft.VisualStudio.Shell.Interop.8.0     | 8.0.0.0          | v2.0
  Microsoft.VisualStudio.Shell.Interop.9.0     | 9.0.0.0          | v2.0
  Microsoft.VisualStudio.Shell.Interop.10.0    | 10.0.0.0         | v2.0
  Microsoft.VisualStudio.Shell.Immutable.10.0  | 10.0.0.0         | v4.0
  Microsoft.VisualStudio.Shell.11.0            | 11.0.0.0         | v4.0

To avoid dependencies on unofficial NuGet packages or on specific local folders (either from a Visual Studio SDK or from a




“Reset the” and executing the “Reset
the Visual Studio 2017 Experimental
Instance” command.) If you go to the
Debug tab on the Properties page of
the project, you can set the Start exter-
nal program field to the Visual Studio
2017 devenv.exe file. (It’s important to
change this if upgrading, since it would
point to an old version of Visual
Studio.) You can also see that the
Command line arguments specify
“Exp” as the root suffix (see Figure 8),
so that the Experimental Instance is
also used for debugging.
Figure 8 Debug Experimental Instance Click the Debug | Start Debugging
menu entry and a new Visual Studio
Visual Studio installation), the best approach is to get the assem- instance will be launched (notice its caption indi­cates “Experimen-
blies from wherever and create a folder called VS2012Assemblies tal Instance”). If you click the Tools | Invoke MyCommand1 menu
under the root folder of the project. Then, copy the DLLs to that entry, the package will be loaded, the command will be executed
folder, reference them from there (using the Browse button of the and a message box will be shown.
project’s Reference Manager dialog) and add the VS2012Assemblies If you want to use Visual Studio 2017 to debug the extension on
folder to source code control, ensuring that the DLLs are added a previous Visual Studio version, you need to make two changes:
to it (normally source code control tools don’t add DLLs by First, because once the extension is built it’s deployed to the
default). So, from this point, the required Visual Studio assemblies Visual Studio Experimental Instance of the version whose SDK was
are part of the source code. used to build the project, you need to remove the NuGet package
Microsoft.VSSDK.BuildTools version 15.0 and use version 14.0
for Visual Studio 2015 or version 12.0 for Visual Studio 2013. For
As you probably know, many Visual Studio 2012 there isn’t a NuGet package for the VSDK, so
you need to edit the .csproj file and point the VSToolsPath vari-
Visual Studio projects support able to the location of the VSSDK 11.0 (C:\Program Files (x86)\
MSBuild\Microsoft\VisualStudio\v11.0), which you must install
“round-tripping”; that is, they separately. Second, you need to go to the Debug tab on the
Properties page of the project and set the Start external program
can be opened and debugged field to the matching Common7\IDE\devenv.exe executable.
As you probably know, many Visual Studio projects support
by several Visual Studio versions round-tripping. That is, they can be opened and debugged by
several Visual Studio versions without suffering modifications.
without suffering modifications. This is not the case with extensibility projects “out of the box.”
However, with some mastering of MSBuild and Visual Studio SDKs,
you may achieve it, but it’s always a tricky approach.
To follow the principle of not including assembly references in Once you’re done with the development and debugging, you can
the VSIX file and not even in the output folder, select each refer- build your extension in Release configuration and test it on Visual
ence and in the Properties window ensure that the Copy Local Studio versions installed in isolated instances on test machines. If
property is set to False. At this point the solution can be rebuilt everything goes well, you can then publish your extension on the
without errors. Using Windows Explorer, go to the output folder. Visual Studio Marketplace! n
Only these files should be generated: extension.vsixmanifest,
VSIX­ProjectVS2012_2017.dll, VSIXProjectVS2012_2017.pkgdef
and VSIXProjectVS2012_2017.vsix.
When you build the project, one of the MSBuild targets deploys Carlos Quintero has received the Microsoft Most Valuable Professional award 14
times, currently in the category of Visual Studio and Development Technologies.
the extension to the Experimental instance of Visual Studio. This He has been helping other developers to create extensions for Visual Studio since
is an instance of Visual Studio that uses different folders and 2002, blogging about it since 2006 at visualstudioextensibility.com and more
Registry entries than the normal instance, so that you don’t make recently tweeting about it: @VSExtensibility.
the normal instance unusable if something goes wrong with
your extension during development. (You can always reset the Thanks to the following technical experts for reviewing this article:
Experimental instance clicking the Windows Start button, typing Justin Clareburt, Alex Eyler and Mads Kristensen

16 msdn magazine Visual Studio

0817msdn_QuinteroExt_v3_8-16.indd 16 7/12/17 11:56 AM




XAMARIN.FORMS

How Xamarin.Forms
Customization Took an
FAA Drone App Higher
Dan Hermes

More than 1 million drones are in the hands of recreational flyers. People are taking unprecedented video footage of events, geography and nature with drone cameras. Commercial drone flyers are conducting inspections of structures and surveying land in a way that's changing their industries. All these drones in the air have become the concern of the Federal Aviation Administration (FAA), which has responded with a strategy and a series of new regulations aimed at helping flyers operate safely and legally.

Of course, there's an app for that. It's called B4UFLY, and it's written in Xamarin.Forms (note: B4UFLY is used with permission from Network Designs and the FAA). Drawing upon FAA airport and special location data, the app provides flyers with an interactive map and real-time status updates depending on their position or their planned flight. The status reflects the level of flight safety and legality and helps the flyer find areas away from airports and restricted airspace. The app, as shown in Figure 1, has been downloaded more than 300,000 times and is in its second year of updates.

This article discusses:
• Xamarin.Forms customization
• Cross-platform mobile development
• Dynamic layouts

Technologies discussed:
Xamarin.Forms, Custom Renderer, Effects, Native View Declaration

The beauty of this Xamarin.Forms implementation is just how much of it is truly cross-platform. Of the 25 screens in the app, only one requires platform-specific customization. Many, if not most, mobile app requirements today include a cross-platform mandate. If such an app plan is mostly data entry and display, standard navigation and UI, and minimal graphics and animation, then it should be considered a strong candidate for development using Xamarin.Forms.

What Is Xamarin.Forms?
Xamarin.Forms is a library of cross-platform UI classes built atop Xamarin.Android and Xamarin.iOS that also binds directly to the native Universal Windows Platform (UWP), as shown in Figure 2. This provides a cross-platform set of UI components that render in each of the three native OSes.

Xamarin.Forms provides a cross-platform library of pages, layouts, and controls and is a great place to begin building an app quickly. There are two ways to create UIs in Xamarin.Forms: either in C# using the rich Xamarin.Forms API or using Extensible Markup Language (XAML), a declarative markup language created by Microsoft.

What Does the Xamarin.Forms Solution Look Like?
The B4UFLY solution contains four projects. The B4UFly project contains the Xamarin.Forms markup and code. The b4ufly.Droid project contains the Android-specific code and the b4ufly.iOS


project is the iOS piece of the solution. B4UFly_UITEST contains scripts for UI testing, which can be done on a local computer or, ultimately, on Xamarin Test Cloud.

The Xamarin.Forms project, called B4UFly, contains cross-platform UI code written using XAML with C# codebehind and the Xamarin.Forms library. Cross-platform business logic and data access code is housed in the UTILS folder. App.cs is the initialization file for the Xamarin.Forms app.

Each platform-specific project has its own startup file for the respective OS. The Android project contains a startup file called MainActivity.cs, which defines an activity class inherited from Xamarin.Forms.Platform.Android.FormsApplicationActivity. The iOS project contains a startup file called AppDelegate, which inherits from Xamarin.Forms.Platform.iOS.FormsApplicationDelegate. Once a Xamarin.Forms project is created, development of the UI can follow.

Dynamic Layouts
B4UFLY makes use of all of the standard Xamarin layouts, including StackLayout, AbsoluteLayout and Grid. Xamarin.Forms Layouts can also be employed to create dynamic layouts with content that changes in real time. This isn't about data binding, although that's possible, as well. This is about modifying the structure and appearance of the screens themselves.

The two most important screens in the app are the map and the status page. The map is where the flyer's GPS position is determined, and where surrounding locations and flight restrictions and airports are displayed. The map is also where a pin can be dropped, in something called Planning Mode, so the flyer can determine if it's safe to fly there.

The status page (Figure 3) tells the user if it's safe to fly. There are three main statuses: yellow, orange and red. (There's no green because of lawyers.) Each of these statuses is reflected on the status page by a different status icon, by the text in the header and the color of the header's background, as well as the text that's displayed on the page to explain the status. Even the additional info buttons at the bottom of the page can change. The entire status page is dynamic.

Xamarin.Forms provides several ways to change content in midstream, providing dynamic content modifiable in real time. The first way is to modify existing layouts and their elements. The second is to show and hide elements. The third way is to add and remove elements from the page using C#. B4UFLY employs all three of these approaches in the status screen.

Modifying layouts begins with a layout to modify, created using XAML in this case, though it could just as easily be created using C#. This example is a StackLayout containing a status bar at the top of the map containing a status icon called topStatusIcon:

  <StackLayout x:Name="topStatusIconHolder" Orientation="Horizontal"
    VerticalOptions="FillAndExpand" HorizontalOptions="StartAndExpand"
    Padding="0, 5, 5, 0" BackgroundColor="White" >
    <Image x:Name="topStatusIcon" Aspect="AspectFit" Source="Blank.png"
      VerticalOptions="CenterAndExpand"
      BackgroundColor="Transparent" HorizontalOptions="CenterAndExpand"
      HeightRequest="50" WidthRequest="50" />
  </StackLayout>

Depending on the user's flight location, the status can change to fly or no-fly. This example shows a no-fly situation and the text and icon are updated to reflect the restriction:

  if (safeToFlyResult.isInForbiddenZone == true)
  {
    topStatusTextHolder.BackgroundColor = Color.White;
    topStatusText.Text = "Flight Prohibited";
    topStatusText.IsVisible = true;
    topStatusIcon.Source = ImageSource.FromFile("no_drone_zone.png");

Showing and hiding elements begins with a XAML layout, the "DO NOT FLY" status in this case:

  <StackLayout x:Name="stackForbiddenToFly" Orientation="Vertical" IsVisible="false"
    Padding="10, 20, 10, 5" VerticalOptions="Start">
    <Label x:Name="forbiddenDoNotFlyText" Text="DO NOT FLY YOUR AIRCRAFT"
      TextColor="#DA4E5B"
      FontSize="22" FontAttributes="Bold" HorizontalOptions="Center"
      HorizontalTextAlignment="Center" />
  </StackLayout>

When the status is determined to be no-fly because a location has been chosen where drone flight is prohibited, the StackLayout stackForbiddenToFly is made visible (as shown in Figure 3):

  if (safeToFlyResult.isInForbiddenZone == true)
  {
    stackForbiddenToFly.IsVisible = true;
    ...

The final dynamic UI approach is the physical removal of elements from a layout using C# code. Here's an example of a layout and button being removed from a layout's collection of child elements:

  stackCurrentLocationTop.Children.Remove (refreshComboStack);
  stackCurrentLocationTop.Children.Remove (dismissImgBtn);

Add a layout to a layout's children:

  stackCurrentLocationTop.Children.Add
    (refreshComboStack, 3, 4, 0, 1);

Those are the three main approaches to dynamic UI: modify existing layouts and their elements, showing and hiding elements, and adding and removing elements and layouts from layout collections using C#.

Xamarin.Forms has become an increasingly easier choice with the outstanding support for Xamarin.Forms customization, providing access to native UI features. A good rule of thumb is that you don't want to have to customize (by platform) more than 20 percent to 30 percent of your app. More than that and you should use a platform-specific option, such as Xamarin.Android or Xamarin.iOS. So what does it mean to customize a Xamarin.Forms app?

Figure 1 B4UFLY Planning Mode Helps People Find Places to Fly Their Drones

Customizing Your App Using Xamarin.Forms
Before Xamarin.Forms was released, I would code my mobile app's cross-platform business logic and data layer in C#. I


would then build my UIs with complete access to the underlying native SDKs, but I'd need to make UIs for each platform using Xamarin.iOS, Xamarin.Android or Windows 10 SDK.

So when it was first announced that with Xamarin.Forms you could build your mobile UI only once and compile for iOS, Android and the UWP, my heart skipped a beat. That's because it's what I always longed for: an end-to-end cross-platform development experience.

However, I knew just how deep Xamarin already went when it came to native UI and I wondered: "What if I need something that Xamarin.Forms can't do?"

I asked everyone I knew to explain exactly what Xamarin.Forms could do and what it couldn't do, and I received many terrific responses that helped me better understand Xamarin.Forms, but no one could really answer my question. So, I wrote a book to answer it: "Xamarin Mobile Application Development" (Apress, 2015). And here's a spoiler: Use custom renderers.

Custom renderers give you the ability to punch down through the Xamarin.Forms abstraction and gain direct access to Xamarin.Android, Xamarin.iOS and the UWP. This means access to the native UI SDKs: iOS UIKit, Android SDK and Windows 10 SDK. You can create platform-specific views and pages in the platform-specific project anytime you need to use functionality in native iOS, Android and Windows.

Figure 2 Xamarin Libraries Bind to Native OS Libraries (diagram: Xamarin.Forms sits atop Xamarin.iOS and Xamarin.Android, which bind to iOS UIKit, the Android SDK and the Windows 10 SDK)

Using the Xamarin.Forms built-in Dependency Injection, you initialize and reference the custom UI class and Xamarin pulls it out of the appropriate platform's project for you. That's how the map page was built in B4UFLY.

But what if you just want to change one or two properties or events and don't need an entire custom UI class? Enter Effects. Coding an entire UI renderer class for each platform can be excessive. Sometimes all that's needed is a tweak to a single control element, such as a drop shadow on a label. While custom renderers expose an entire platform-specific class, Effects exposes just its properties. The entire element needn't be subclassed, though a platform-specific class is necessary. A Xamarin.Forms effect offers this precision approach to platform-specific UI customization.

What if all you really need is a native control on your Xamarin.Forms layout? Take the plunge and declare a platform-specific control, sometimes called a "native control," though it's a Xamarin control and not truly native. Instead of coding overly customized custom renderers, declare native views from Xamarin.iOS, Xamarin.Android or the UWP directly into your Xamarin.Forms layouts. Using a shared project and conditional compilation, include platform-specific UI libraries in your C# UI classes where you can reference them as directly as if you were coding in the native platform. Set properties and event handlers on these views and use them side-by-side with Xamarin.Forms views, in both C# and XAML.

Figure 3 B4UFLY Status Page in a No-Fly Area
Figure 4 Map Page at Lexicon Systems Office in Beverly, Mass.
Figure 5 B4UFLY Planning Mode Page in San Francisco, Calif.

Xamarin.Forms development gives you the ease of cross-platform development using C# and a single UI library with a solid


magazine VP MSDN MAGAZINE VENDOR PROFILE

JetBrains Rider:
New Cross-Platform .NET IDE
A Q&A with Kirill Skrygan,
Rider Team Lead at JetBrains

Q What is JetBrains Rider?

A Rider is our new stand-alone cross-platform IDE for .NET development. It is built on top of the IntelliJ platform and incorporates features of our well-known Visual Studio extension, ReSharper. Rider is a smart, powerful, yet very fast and smooth IDE.

Rider provides .NET developers with smart code completion, highlighting, search, navigation, code inspections, quick-fixes, and refactorings. The IntelliJ platform brings a debugger, excellent VCS integration, local history, building, and many other features that help developers be productive.

We are doing our best to make Rider fast, because it's one of the key aspects that affect developer happiness. Rider uses our new technology that runs features like code indexing or analysis in a process that is completely separated from the UI, which allows Rider to be a very powerful, yet smooth and responsive IDE that lets you develop .NET applications on Mac or Linux, not just Windows.

Q What platforms and technologies does Rider support?

A The majority of modern .NET technologies are supported, including ASP.NET and ASP.NET Core web applications, as well as desktop .NET applications.

We support Unity via a bundled plugin that brings Unity-specific features to Rider. Xamarin is supported on both iOS and Android platforms, so Rider can load, build, run and debug Xamarin applications.

Rider embeds IntelliJ components to better support different .NET-related technologies. For example, it uses amazing features from WebStorm to support all kinds of web technologies, and DataGrip functionality to work with databases and SQL files.

Q Is Rider going to support everything ReSharper supports?

A Right now Rider has 90% of ReSharper features. Diagrams, call tracking and hierarchy tool windows will ultimately be supported, too. Moreover, both IntelliJ and ReSharper plugins can be used with Rider as well.

Q Why should Visual Studio users consider trying Rider?

A If we're talking about the pure Visual Studio experience (without ReSharper), the reasons are fairly obvious: Rider will give you what Visual Studio simply cannot: hundreds of code inspections and fixes, small and large refactorings, smart navigation and code generation.

Compared to Visual Studio with ReSharper installed, Rider still gives you strong reasons to consider it. It's faster (according to feedback we receive from our users) and, because it uses a 64-bit architecture, Rider can work with very complex or large solutions that are beyond the 32-bit capabilities of current Visual Studio. And don't forget that Rider can work on any modern OS, not just Windows.

To learn more about Rider and download a free 30-day trial, please visit www.jetbrains.com/rider



foundation of customization options for those times when you really need native features. Use custom renderers to build platform-specific UI classes using Xamarin.iOS, Xamarin.Android and the UWP. Use effects to access platform-specific properties. And when you can't do without the real thing, declare a native view in your Xamarin.Forms layout.

[Figure 6 Xamarin.Forms UI: a cross-platform UI layer of shared C# code and markup using Xamarin.Forms feeds the iOS, Android and Windows 10 apps; a platform-specific UI layer holds C# code using Xamarin.iOS, Xamarin.Android and the Windows 10 SDK; and a core library holds the business logic and data layer as shared C# code.]

Custom Renderers—B4UFLY Map Using Custom Renderers
The B4UFLY map page is the only page of more than 25 in the app that requires customization. That ratio of 25:1 generic Xamarin.Forms pages to customized page makes this app a strong case study for Xamarin.Forms.

The map uses your current location and provides immediate surrounding flight restrictions and warnings, as shown in Figure 4. A variation on the map page is Planning Mode, which permits the dropping of a pin to determine the restrictions and flight statuses of hypothetical locations, as shown in Figure 5. Note the icon in the upper left indicating "no-fly" due to a nearby controlled airspace (the "C" icon).

Xamarin.Forms binds to only a fraction of the features available in the complete platform-specific UI libraries (iOS UIKit, Android SDK and Windows 10 SDK). Fortunately, Xamarin.Forms exposes the mechanism whereby cross-platform views are converted into platform-specific views. This mechanism is called rendering. By creating your own custom renderers, you get full access to platform-specific features buried deep within each view.

Custom renderers are a bridge between Xamarin.Forms and the Xamarin platform-specific libraries: Xamarin.iOS, Xamarin.Android and the Windows 10 SDK. Think of a custom renderer as a way to access and extend the binding between Xamarin.Forms and the platform-specific elements.

Project requirements call for features not possible with the out-of-the-box Xamarin.Forms.Maps library, including the placement of icons and colored areas around each icon to delimit certain airspaces on the map. Custom rendering to the rescue! Beginning with MapPage, created by inheriting ContentPage, you can create a foundational class, which you can use to customize its renderer for each platform, letting you code custom graphics separately for iOS and Android:

  namespace b4ufly.iOS
  {
    public partial class MapPage : ContentPage
    {
      public static MapPage me = null;
      public static MyMap map = null;
      public static Boolean plannerModeOn = false;

Once you have a custom element, MapPage, then you need to create the custom renderers for each platform, iOS and Android in B4UFLY, although you can also do this for UWP. Renderers realize a view on the native platform. You create your own renderer by inheriting from the standard MapRenderer, beginning with iOS:

  [assembly:ExportRenderer (typeof(MyMap), typeof(MyMapRenderer))]
  namespace b4ufly.iOS
  {
    public class MyMapRenderer : MapRenderer, MapExtension
    {

MyMapRenderer draws the locations on the map that drone flyers need to be aware of: airports, controlled airspace, military facilities and the like. The renderer draws both icons and surrounding colored areas denoting the important airspace. These types of graphics are handled slightly differently in iOS than in Android. The Android map renderer uses a similar approach to the one used for iOS:

  [assembly: ExportRenderer (typeof(MyMap), typeof(MyMapRenderer))]
  namespace b4ufly.Droid
  {
    public class MyMapRenderer : MapRenderer, MapExtension,
      GoogleMap.IOnCameraChangeListener, GoogleMap.IOnMarkerDragListener,
      GoogleMap.IOnMarkerClickListener
    {

Once you create the renderers, it's time to use them. Based on the MyMap data type, which uses the MyMapRenderer, the following statement instantiates a platform-specific map:

  map = new MyMap(MapSpan.FromCenterAndRadius(new Position(0, 0),
    Distance.FromMiles(1.0)))

The built-in Inversion of Control (IoC) mechanism in Xamarin.Forms uses the renderer from the platform project currently being built. By adding platform-specific map references, you could explicitly instantiate an Apple Mapkit in the iOS renderer and a Google Map in the Android renderer.

Figure 7 iOS Implementation of DropShadowEffectLabel

  [assembly:ResolutionGroupName ("FAA")]
  [assembly:ExportEffect (typeof(DropShadowEffectLabel), "DropShadowEffectLabel")]
  namespace b4ufly.iOS
  {
    public class DropShadowEffectLabel : PlatformEffect
    {
      protected override void OnAttached ()
      {
        try {
          var effect =
            (DropShadowEffect)Element.Effects.FirstOrDefault
              (e => e is DropShadowEffect);
          if (effect != null) {
            Control.Layer.ShadowColor = effect.Color.ToCGColor();
            Control.Layer.CornerRadius = 5;
            Control.Layer.ShadowOffset = new CGSize (5, 5);
            Control.Layer.ShadowOpacity = 1.0f;
          }
        } catch (Exception ex) {
          Console.WriteLine ("Cannot set effect property. Error: {0}", ex.Message);
        }
      }

      protected override void OnDetached ()
      {
      }
    }
  }
22 msdn magazine Xamarin.Forms





Customization of Xamarin.Forms elements leads you to a different view of the solution architecture, with custom renderers residing in the middle platform-specific UI layer, as shown in Figure 6.

Custom renderers are powerful and thorough in their implementation as platform-specific enablers of Xamarin.Forms UI elements. Custom renderers are, however, heavy artillery. If you want something more tactical, like merely customizing a property on a Xamarin.Forms control, consider an "effect."

Effects
Effects provide access to individual platform-specific properties of controls and can be parameterized. To create an effect, first create a class that is a subclass of the RoutingEffect class. Mind the method overrides and attributes. Then use the effect in your app.

In addition to exposing properties, effects also have the capacity to pass parameters to those properties and define events on Xamarin.Forms controls. You pass parameters to the effect using Attached Properties or the Common Language Runtime (CLR). The following example uses the CLR to bind properties to the effect and creates the DropShadowEffect in the Xamarin.Forms project:

  public class DropShadowEffect : RoutingEffect
  {
    public Color Color { get; set; }
    public DropShadowEffect () : base ("FAA.DropShadowEffectLabel")
    {
    }
  }

This label effect provides a color property for the shadow and references the platform-specific implementation of the DropShadowEffectLabel in its base class.

You implement the effect in a platform-specific project, similarly to a custom renderer, although implementation is optional in each platform. Once per project, you add a ResolutionGroupName attribute containing your company name to avoid collisions with other effects of the same name. Each Effect class is subclassed from PlatformEffect and needs an ExportEffect, which registers the effect with Xamarin.Forms. Figure 7 shows an implementation on iOS in the Xamarin.iOS project.

Control is an iOS UIView. PlatformEffect exposes these methods, which must be overridden:
  • OnAttached—customize the control here
  • OnDetached—perform cleanup (for example, deregister events)

Next is the Android implementation, similar to the iOS effect, except that the label control is the Android-specific TextView, as shown in Figure 8. The TextView control is typed explicitly to access the SetShadowLayer method.

Once the effect is in place, it's time to invoke it. First, a control needs to be declared in XAML or C#. You then attach the effect to the control by adding it to the control's Effects collection. The following example shows the XAML approach, with a Label control declared in XAML, the DropShadowEffect added to the control's Effects collection and the Color property set to black:

  <Label Text="Label with Shadow" ... >
    <Label.Effects>
      <local:DropShadowEffect Color="Black">
      </local:DropShadowEffect>
    </Label.Effects>
  </Label>

Using C# instead of XAML, the label with attached effect can be created, as shown here:

  var label = new Label {
    Text = "Label with Shadow",
    ...
  };
  label.Effects.Add (new DropShadowEffect {
    Color = Color.Black,
  });

Tactical customization using effects lets you make specific changes to the Xamarin.Forms controls, but sometimes changing certain properties and methods just isn't enough. When you want to use a lot of features of a native control, you wind up doing a lot of custom effects coding.

Native View Declaration
Sometimes you want complete control of the UI. Thankfully there's now a way to get this in Xamarin.Forms via native view declaration. Declared native controls are incredibly powerful, but are not without limitations. They're easiest to use in XAML, secondarily in C# using a Shared Project (which is called native embedding), though it's possible but not easy or recommended to use them in a Portable Class Library (PCL). A lot of projects use PCLs, and that often means native views are best used in XAML; that's the approach I'll cover here.

There are two steps in declaring a native view in XAML. First, specify the namespace for each native source. Second, declare the native view. Figure 9 shows an example, using the label control. It begins with the basic XAML page and defines the namespaces for iOS, Android and Windows.

Next, native views are declared in the Content property of the ContentPage: a UILabel for iOS, a TextView for Android and a TextBlock for Windows:

  <ContentPage.Content>
    <ios:UILabel Text="This is an iOS UILabel" View.HorizontalOptions="Start"/>
    <androidWidget:TextView Text="This is an Android TextView"
      x:Arguments="{x:Static formsandroid:Forms.Context}" />
    <win:TextBlock Text="This is a Windows TextBlock"/>
  </ContentPage.Content>

Figure 8 Android Implementation of DropShadowEffectLabel

  [assembly:ResolutionGroupName ("FAA")]
  [assembly:ExportEffect (typeof(DropShadowEffectLabel), "DropShadowEffectLabel")]
  namespace b4ufly.Droid
  {
    public class DropShadowEffectLabel : PlatformEffect
    {
      protected override void OnAttached ()
      {
        try {
          var control = Control as Android.Widget.TextView;
          var effect =
            (DropShadowEffect)Element.Effects.FirstOrDefault
              (e => e is DropShadowEffect);
          if (effect != null) {
            Android.Graphics.Color color = effect.Color.ToAndroid ();
            control.SetShadowLayer (5, 5, 5, color);
            // params: radius, offsetX, offsetY, color
          }
        } catch (Exception ex) {
          Console.WriteLine ("Cannot set effect property. Error: {0}", ex.Message);
        }
      }

      protected override void OnDetached ()
      {
      }
    }
  }


Figure 9 Native Control Namespace Declarations

  <?xml version="1.0" encoding="utf-8"?>
  <ContentPage xmlns="http://xamarin.com/schemas/2014/forms"
    xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"
    xmlns:ios="clr-namespace:UIKit;assembly=Xamarin.iOS;targetPlatform=iOS"
    xmlns:androidWidget="clr-namespace:Android.Widget;assembly=
      Mono.Android;targetPlatform=Android"
    xmlns:formsandroid="clr-namespace:Xamarin.Forms;assembly=
      Xamarin.Forms.Platform.Android;targetPlatform=Android"
    xmlns:win="clr-namespace:Windows.UI.Xaml.Controls;assembly=Windows,
      Version=255.255.255.255, Culture=neutral, PublicKeyToken=null,
      ContentType=WindowsRuntime;targetPlatform=Windows"
    x:Class="b4ufly.NativeView" >
    <ContentPage.Content>
    </ContentPage.Content>
  </ContentPage>

Those are the three approaches to Xamarin.Forms customization: custom renderers, effects and native view declaration. Custom renderer is a heavyweight option offering a lot of flexibility, while effects provide a surgical approach to customization. Native view declaration is the nuclear option, circumventing Xamarin.Forms entirely.

Wrapping Up
You'll eventually need more from Xamarin.Forms than it gives you out-of-the-box, just like I did with B4UFLY. When complex tasks or designs are required, virtually anything is possible using Xamarin.Forms customization. Customization provides access to the lower-level, platform-specific, screen-rendering classes called "renderers," which use platform-specific controls to create all Xamarin.Forms screens. Any Xamarin.Forms screen can be broken into platform-specific screens, classes, controls and properties using this approach.

A lighter-weight approach is to use effects to access platform-specific properties and events. You can also use entire native controls on your Xamarin.Forms pages using native view declaration. This means that you can write a Xamarin.Forms page or app and customize it by platform. Use customization sparingly, or risk a fragmented UI code base that probably should have been written entirely as a platform-specific UI. Used judiciously, customization can turn your basic, lackluster product into a versatile, unique, popular app.

In B4UFLY, the FAA's investment in Xamarin.Forms continues to pay off because of the many ongoing enhancements that are generic to the many cross-platform text-based pages. The platform-specific map page contains some cross-platform elements, but much of that page requires platform-specific customization. This Xamarin.Forms architecture is extensible, and development times and costs are lower because of it; the significant code reuse is practical and elegant.

Dan Hermes is a Xamarin MVP, a Microsoft MVP, and author of "Xamarin Mobile Application Development." He is principal of Lexicon Systems, a Boston-based consultancy building award-winning mobile apps and helping companies build their own successful apps. Follow his blog at mobilecsharpcafe.com, on Twitter: @danhermes or contact him at [email protected].

Thanks to the following technical expert for reviewing this article: Jesse Liberty


DEVOPS

Git Internals: Architecture and Index Files
Jonathan Waldman

In my last article (msdn.com/magazine/mt809117), I showed how Git uses a directed acyclic graph (DAG) to organize a repo's commit objects. I also explored the blob, tree and tag objects to which commit objects can refer. I concluded the article with an introduction to branching, including the distinction between HEAD and head. That article is a prerequisite to this one, in which I'll discuss the Git "three-tree" architecture and the importance of its index file. Understanding these additional Git internals will build on the foundational knowledge that will make you a more effective Git user and will provide new insights as you explore various Git operations fronted by the graphical Git tooling in the Visual Studio IDE.

Recall from the last article that Visual Studio communicates with Git using a Git API, and that the Visual Studio IDE Git tooling abstracts away the complexity and capabilities of the underlying Git engine. That's a boon for developers who want to implement a version-control workflow without needing to rely on the Git command-line interface (CLI). Alas, the otherwise helpful Git abstractions of the IDE can sometimes lead to confusion. For example, ponder the basic workflow of adding a project to Git source control, modifying project files, staging them and then committing the staged files. To do that, you open the Team Explorer Changes pane to view the list of changed files and then you select the ones you want to stage. Consider the leftmost image in Figure 1, which shows that I changed two files in the working directory (Marker 1).

In the next image to the right, I staged one of those changed files: Program.cs (Marker 2). When I did that, Program.cs appears to have "moved" from the Changes list to the Staged Changes list. If I further modify and then save the working directory's copy of Program.cs, it continues to appear in the Staged Changes section (Marker 3)—but it also appears in the Changes section (Marker 4)! Without understanding what Git is doing behind the scenes, you might be flummoxed until you figured out that two "copies" of Program.cs exist: one in the working folder and one in the Git internal database of objects. Even if you realize that, you might not have any insight as to what would happen when you unstage the staged copy, try to stage the second changed copy of Program.cs, undo changes to the working copy or switch branches.

To truly grasp what Git is doing as you stage, unstage, undo, commit and check out files, you first must understand how Git is architected.

This article discusses:
• Git's three-tree architecture
• How the Git index works
• Index extensions

Technologies discussed: Visual Studio 2017, Git for Windows 2.10

The Git Three-Tree Architecture
Git implements a three-tree architecture (a "tree" in this context refers to a directory structure and files). Working from left to right



in Figure 2, the first tree is the collection of files and folders in the working directory—the OS directory that contains the hidden .git folder; the second tree is typically stored in a single binary file called index, located in the root of the .git folder; the third tree is composed of Git objects that represent the DAG (recall that SHA-1-named Git objects are located in two-hex-digit-named folders under .git\objects and can also be stored in "pack" files located in .git\objects\pack and in file paths defined by the .git\objects\info\alternates file). Keep in mind that the Git repo is defined by all files that sit in the .git folder. Often, people refer to the DAG as the Git repo, and that's not quite accurate: The index and the DAG are both contained in the Git repo.

Notice that while each tree stores a directory structure and files, each leverages different data structures in order to retain tree-specific metadata and to optimize storage and retrieval. The first tree (the working directory tree, also called "the working tree") is plainly the OS files and folders (no special data structures there, other than what's at the OS level) and serves the needs of the software developer and Visual Studio; the second tree (the Git index) straddles the working directory and the commit objects that form the DAG, thereby helping Git perform speedy working-directory file-content comparisons and quick commits; the third tree (the DAG) makes it possible for Git to track a history of commits, as discussed in the previous article. In its capacity as a robust version control system, Git adds helpful metadata to the items it stores in the index and in commit objects. For example, the metadata it stores in the index helps it detect changes to files in the working directory, while the metadata it stores in commit objects helps it track who issued the commit and for what reason.

To review the three trees in the three-tree architecture and to put some perspective around the remainder of this article's focus: You already know how the working-directory tree functions, because it's actually the OS file system you're already well-versed in using. And if you read my earlier article, you should have good working knowledge of the DAG. Thus, at this point, the missing link is the index tree (hereafter, "the index") that straddles the working directory and the DAG. In fact, the index plays such an important role that it's the sole subject of the remainder of this article.

[Figure 1 The Team Explorer Changes Pane Can Show the Same File in Its Changes and Staged Changes Sections]

[Figure 2 The Git Three-Tree Architecture Leverages the All-Important Index File for Its Smart and Efficient Performance: the working directory is stored as the OS folder containing .git; the index is stored in the single binary file .git\index; the DAG is stored as compressed binary files under .git\objects. The index and the DAG sit inside the Git repo; all three trees sit in the OS file system.]

How the Index Works
You might have heard the friendly advice that the index is synonymous with the "staging area." While that's somewhat accurate, to speak of it that way belies its true role, which is not only to support a staging area, but also to facilitate the ability of Git to detect changes to files in your working directory; to mediate the branch-merge process, so you can resolve conflicts on a file-by-file basis and safely abort the merge at any time; and to convert staged files and folders into tree objects whose references are written to the next commit object. Git also uses the index to retain information about files in the working tree and about objects retrieved from the DAG, thus further leveraging the index as a type of cache.

Let's investigate the index more thoroughly. The index implements its own self-contained file system, giving it the ability to store references to folders and files along with metadata about them. How and when Git updates this index depends on the kind of Git command issued and the command options specified (if you're so inclined, you can even use the Git update-index plumbing command to manage the index yourself), so exhaustive coverage here isn't possible. However, as you work with the Visual Studio Git tooling, it's helpful to be aware of the primary ways in which Git updates the index and in which Git uses information stored in the index. Figure 3 shows that Git updates the index with working directory data when you stage a file, and it updates the index with DAG data when you initiate a merge (if there are merge conflicts), clone or pull, or switch branches. On the other hand, Git relies on information stored in the index when it updates the DAG after you issue a commit, and when it updates the working directory after you clone or pull, or after you switch branches. Once you realize that Git relies on the index and that the index straddles so many Git operations, you'll begin to appreciate the advanced Git commands that modify the index, effectively empowering you to finesse how Git operates.
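As an aside, the index file itself opens with a small fixed-format header: a 4-byte signature "DIRC" (for "dircache"), a 4-byte version number and a 4-byte count of index entries, all stored big-endian. The following Python sketch parses just that 12-byte header from a synthetic buffer; a real .git\index continues with the per-file entries, which this sketch ignores:

```python
import struct

def parse_index_header(data):
    # The on-disk index begins with a 12-byte header:
    # a 4-byte signature "DIRC", a 4-byte version number and
    # a 4-byte count of index entries, all big-endian.
    signature, version, entry_count = struct.unpack(">4sLL", data[:12])
    if signature != b"DIRC":
        raise ValueError("not a Git index file")
    return version, entry_count

# Synthetic header standing in for .git\index (version 2, 7 entries):
fake_header = struct.pack(">4sLL", b"DIRC", 2, 7)
print(parse_index_header(fake_header))  # (2, 7)
```

Pointing the same parser at the first 12 bytes of a real .git\index reveals how many files the index currently tracks without running any Git command.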


[Figure 3 Primary Git Actions That Update the Index (Green) and Git Actions That Rely on What the Index Contains (Red): stage (add), merge, clone, pull and switch branch (check out) update the index from the working directory or the DAG; commit, clone, pull and switch branch (check out) rely on the index to update the DAG or the working directory.]

Let's create a new file in the working directory to see what happens to it as it's written to the index. As soon as you stage that file, Git creates a header using this string-concatenation formula:

  blob{space}{file-length in bytes}{null-termination character}

Git then concatenates the header to the beginning of the file contents. Thus, for a text file containing the string "Hello," the header + file contents would generate a string that looks like this (keep in mind there's a null character before the letter "H"):

  blob 5Hello

To see that more clearly, here's the hexadecimal version of that string:

  62 6C 6F 62 20 35 00 48 65 6C 6C 6F

Git then computes an SHA-1 for the string:

  5ab2f8a4323abafb10abb68657d9d39f1a775057

Git next inspects the existing index to determine if an entry for that folder\file name already exists with the same SHA-1. If so, it locates the blob object in the .git\objects folder and updates its date-modified time (Git will never overwrite objects that already exist in the repo; it updates the last-modified date so as to delay this newly added object from being considered for garbage collection). Otherwise, it uses the first two characters of the SHA-1 string as the directory name in .git\objects and the remaining 38 characters to name the blob file before zlib-compressing it and writing its contents. In my example, Git would create a folder in .git\objects called 5a and then write the blob object into that folder as a file with the name b2f8a4323abafb10abb68657d9d39f1a775057.

When Git creates a blob object in this manner, you might be surprised that one expected file property is conspicuously missing from the blob object: the file name! That's by design, however. Recall that Git is a content-addressable file system and, as such, it manages SHA-1-named blob objects—not files. Each blob object is normally referenced by at least one tree object, and tree objects in turn are normally referenced by commit objects. Ultimately, Git tree objects express the folder structure of the files you stage. But Git doesn't create those tree objects until you issue a commit. Therefore, you can conclude that if Git uses only the index to prepare a commit object, it also must capture the file-path references for each blob in the index—and that's exactly what it does. In fact, even if two blobs have the same SHA-1 value, as long as each maps to a different file name or different path/file value, each will appear as a separate entry in the index.

Git also saves file metadata with each blob object it writes to the index, such as the file's create and modified dates. Git leverages this information to efficiently detect changes to files in your working directory using file-date comparisons and heuristics rather than brute-force re-computing the SHA-1 values for each file in the working directory. Such a strategy speeds up the information you see in the Team Explorer Changes pane—or when you issue the porcelain Git status command.
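The header-and-hash recipe described above is easy to reproduce outside of Git. This Python sketch recomputes the "Hello" blob's SHA-1 and derives the object path for it; it only mimics the hashing and naming scheme, and doesn't write to a real repo:

```python
import hashlib
import zlib

def blob_sha1_and_path(content: bytes):
    # Header formula from the article: blob{space}{length}{null},
    # prepended to the file contents before hashing.
    store = b"blob " + str(len(content)).encode() + b"\x00" + content
    sha = hashlib.sha1(store).hexdigest()
    # The first two hex digits name the folder under .git/objects;
    # the remaining 38 name the zlib-compressed blob file.
    path = ".git/objects/" + sha[:2] + "/" + sha[2:]
    return sha, path, zlib.compress(store)

sha, path, _ = blob_sha1_and_path(b"Hello")
print(sha)   # 5ab2f8a4323abafb10abb68657d9d39f1a775057
print(path)  # .git/objects/5a/b2f8a4323abafb10abb68657d9d39f1a775057
```

This matches what the plumbing command git hash-object reports for the same five-byte file, which is a handy way to confirm that the formula really is all there is to blob naming.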
Once armed with an index entry for a working-directory file along with its associated metadata, Git is said to "track" the file because it can readily compare its copy of the file with the copy that remains in the working directory. Technically, a tracked file is one that also exists in the working directory and is to be included in the next commit. This is in contrast to untracked files, of which there are two types: files that are in the working directory but not in the index, and files that are explicitly designated as not to be tracked (see the Index Extensions section). To summarize, the index gives Git the power to determine which files are tracked, which are not tracked, and which should not be tracked.
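That metadata-driven shortcut can be sketched in a few lines of Python. This is only an illustration of the idea (compare cached stat data first, rehash content only on a mismatch), not Git's actual algorithm; the real index records more stat fields, such as ctime, inode, uid and gid, and applies additional heuristics:

```python
import hashlib
import os
import tempfile

def file_changed(path, cached_size, cached_mtime, cached_sha1):
    # Cheap check first, the way Git uses index metadata: if the
    # stat data still matches what was recorded at staging time,
    # assume the content is unchanged and skip hashing entirely.
    st = os.stat(path)
    if (st.st_size, st.st_mtime) == (cached_size, cached_mtime):
        return False
    # Metadata differs, so fall back to re-hashing the content
    # using the blob header formula described earlier.
    with open(path, "rb") as f:
        content = f.read()
    store = b"blob %d\x00" % len(content) + content
    return hashlib.sha1(store).hexdigest() != cached_sha1

# Demo: "stage" a file by recording its metadata and blob SHA-1.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"Hello")
    path = f.name
st = os.stat(path)
sha = hashlib.sha1(b"blob 5\x00Hello").hexdigest()
print(file_changed(path, st.st_size, st.st_mtime, sha))           # False
print(file_changed(path, st.st_size, st.st_mtime - 1, "0" * 40))  # True
os.remove(path)
```

The payoff is the same one the article describes: a status check over thousands of files usually costs thousands of stat calls rather than thousands of SHA-1 computations.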
[Figure 4 Viewing the History in Order to See What Visual Studio Does When You Create a New Project]

To better understand the specific contents of the index, let's use a concrete example by starting





with a new Visual Studio project. The complexity of this project isn't
so important—you just need a couple of files to adequately illustrate
what's going on. Create a new console application called MSDNConsoleApp
and check the Create directory for solution and the Create new Git
repository checkboxes. Click OK to create the solution.

I'll issue some Git commands in a moment, so if you want to run them on
your system, open a command prompt window in the working directory and
keep that window within reach as you follow along. One way to quickly
open a Git command window for a particular Git repo is to access the
Visual Studio Team menu and select Manage Connections. You'll see a list
of local Git repositories, along with the path to that repo's working
directory. Right-click the repo name and select Open Command Prompt to
launch a window into which you can enter Git CLI commands.

Once you create the solution, open the Team Explorer Branches pane
(Figure 4, Marker 1) to see that Git created a default branch called
master (Marker 2). Right-click the master branch (Marker 2) and select
View History (Marker 3) to view the two commits Visual Studio created on
your behalf (Marker 4). The first has the commit message "Add .gitignore
and .gitattributes"; the second has the commit message "Add project files."

Open the Team Explorer Changes pane. Visual Studio relies on the Git API
to populate items in this window—it's the Visual Studio version of the
Git status command. Currently, this window indicates there are no
unstaged changes in the working directory. The way Git makes this
determination is to compare each index entry with each working directory
file. With the index's file entries and associated file metadata, Git
has all the information it needs to determine whether you've made any
changes, additions, deletions, or if you renamed any files in the
working directory (excluding any files mentioned in the .gitignore file).

So the index plays a key role in making Git smart about differences
between your working directory tree and the commit object pointed to by
HEAD. To learn a bit more about what kind of information the index
provides to the Git engine, go to the command-line window you opened
earlier and issue the following plumbing command:

  git ls-files --stage

You can issue this command at any time to generate a complete list of
files currently in the index. On my system, this produces the following
output:

  100644 1ff0c423042b46cb1d617b81efb715defbe8054d 0  .gitattributes
  100644 3c4efe206bd0e7230ad0ae8396a3c883c8207906 0  .gitignore
  100644 f18cc2fac0bc0e4aa9c5e8655ed63fa33563ab1d 0  MSDNConsoleApp.sln
  100644 88fa4027bda397de6bf19f0940e5dd6026c877f9 0  MSDNConsoleApp/App.config
  100644 d837dc8996b727d6f6d2c4e788dc9857b840148a 0  MSDNConsoleApp/MSDNConsoleApp.csproj
  100644 27e0d58c613432852eab6b9e693d67e5c6d7aba7 0  MSDNConsoleApp/Program.cs
  100644 785cfad3244d5e16842f4cf8313c8a75e64adc38 0  MSDNConsoleApp/Properties/AssemblyInfo.cs

The first column of output is a Unix OS file mode, in octal. Git doesn't
support the full range of file-mode values, however. You're likely to
only ever see 100644 (for non-EXE files) and 100755 (for Unix-based EXE
files—Git for Windows also uses 100644 for executable file types). The
second column is the SHA-1 value for the file. The third column
represents the merge-stage value for the file—0 for no conflict, or 1, 2
or 3 when a merge conflict exists. Finally, notice that the path and
file name for each of the seven blob objects are stored in the index.
Git uses the path value when it builds tree objects ahead of the next
commit (more on that in a moment).

Now, let's examine the index file itself. Because it's a binary file,
I'm going to use HexEdit 4 (a freeware hex editor available at
hexedit.com) to view its contents (Figure 5 shows an excerpt).

Figure 5 A Hex Dump of the Git Index File for the Project

The first 12 bytes of the index contain the header (see Figure 6). The
first 4 bytes will always contain the characters DIRC (short for
directory cache)—this is one reason the Git index is often referred to
as the cache. The next 4 bytes contain the index version number, which
defaults to 2 unless you're using certain features of Git (such as
sparse checkout), in which case it might be set to version 3 or 4. The
final 4 bytes contain the number of file entries contained further down
in the index.

Figure 6 The Git Index Header Data Format

  Index File - Header Entry
  00 - 03 (4 bytes)  DIRC               Fixed header for a directory cache entry.
                                        All index files begin with this entry.
  04 - 07 (4 bytes)  Version            Index version number (Git for Windows
                                        currently uses version 2).
  08 - 11 (4 bytes)  Number of entries  As a 4-byte value, the index supports up
                                        to 4,294,967,296 entries!
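The 12-byte header layout just described is easy to verify in code. The following is a minimal sketch of my own (not from the article): it unpacks the big-endian signature, version and entry-count fields; the function name parse_index_header is mine.

```python
import struct

def parse_index_header(raw: bytes):
    """Unpack the 12-byte Git index header: a 4-byte 'DIRC' signature,
    a 4-byte version number and a 4-byte entry count, all big-endian."""
    signature, version, entry_count = struct.unpack(">4sLL", raw[:12])
    if signature != b"DIRC":
        raise ValueError("not a Git index file")
    return version, entry_count

# Example: a version-2 header describing the article's seven entries.
header = struct.pack(">4sLL", b"DIRC", 2, 7)
print(parse_index_header(header))  # (2, 7)
```

Against a real repository you would feed it the file itself, for example `parse_index_header(open(".git/index", "rb").read())`.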
30 msdn magazine DevOps



MSDN Magazine Vendor Profile

Develop Collaborative Web Applications with DocuVieware
Q&A with Loïc Carrère, owner and CEO of ORPALIS, and creator of
GdPicture.NET and DocuVieware

Q You have just released version 3 of DocuVieware. Can you tell us more
about this SDK?

A DocuVieware is an HTML5 viewer and document management kit for
building web applications. It can be effortlessly integrated into
applications based on different web technologies, like Angular 2,
Node.js, PHP, ASP.NET MVC/Core, etc., with the help of its Web Service
architecture. DocuVieware operates seamlessly on any device and
platform, including mobile phones and tablets.

We help developers create dynamic web applications with user
interaction. All user interface elements and the document's appearance
in the viewer, like rotation and zooming, are managed client-side. The
user can perform a large variety of actions, like creating annotations
and comments, and scanning and printing documents.

Our customers have been using our GdPicture.NET SDK for desktop
applications for many years, and with great success. Some of them have
already made the transition to web applications. That's why we have
developed a tool that is comprehensive and fully customizable, yet easy
to integrate.

DocuVieware is powered by our GdPicture.NET Document Imaging and Image
Processing SDK. To build DocuVieware, we have developed a layer
containing features specific to web applications, which encapsulates
GdPicture.NET and its three thousand functionalities.

Q How do you make DocuVieware evolve?

A DocuVieware is growing every day thanks to customer feedback: if
there is something our toolkit doesn't do yet, and you want it included,
just ask us, and we will see how we can implement it. Customers under
maintenance can influence our roadmap significantly to make DocuVieware
tailored to their needs.

For version 3, we have tried to focus on making collaborative work an
easy task for everyone, thanks to comment statuses and discussion
support. Many other features and improvements are also available with
this new release.

We're constantly looking for the latest technologies on the market, and
we often end up creating our own path when we feel something is missing
in the industry.

Q DocuVieware is a product of your company, ORPALIS. Do you have any
other projects coming up?

A We keep improving our SDK offer by implementing new functionalities
in GdPicture.NET and DocuVieware on a weekly basis.

We have strong expertise in formats: we provide full PDF support and
manage more than a hundred formats. We are also specialized in symbol
recognition. Our OCR, barcoding, and MICR engines are recognized
worldwide for their speed and accuracy.

We are currently working on a new project that combines our technologies
and our knowledge of web infrastructures. DocuVieware is already built
with a scalable architecture, meaning that a web application developed
with DocuVieware will work the same whether you're hosting it on one
server or a thousand. Our future platform will offer access to the same
components available in GdPicture.NET and DocuVieware (plus a lot of
other new features for manipulating documents), in a REST API.

To access your 60-day free trial, go to www.docuvieware.com



Following the 12-byte header is a list of n index entries, where n
matches the number of entries described by the index header. The format
for each index entry is presented in Figure 7. Git sorts index entries
in ascending order based on the path/file name field. The first 8 bytes
represent the time the file was created as an offset from midnight of
Jan. 1, 1970. The second 8 bytes represent the time the file was
modified as an offset from midnight of Jan. 1, 1970. Next are five
4-byte values (device, inode, mode, user id and group id) of
file-attribute metadata related to the host OS. The only value used
under Windows is the mode, which most often will be the octal 100644 I
mentioned earlier when showing output from the ls-files command (this
converts to the 4-byte 814AH value, which you can see at position 26H
in Figure 5).

Following the metadata is the 4-byte length of the file contents. In
Figure 5, this value starts at 030H, which shows 00 00 0A 15 (2,581
decimal)—the length of the .gitattributes file on my system:

  05/08/2017 09:24 PM <DIR>          .
  05/08/2017 09:24 PM <DIR>          ..
  05/08/2017 09:24 PM          2,581 .gitattributes
  05/08/2017 09:24 PM          4,565 .gitignore
  05/08/2017 09:24 PM <DIR>          MSDNConsoleApp
  05/08/2017 09:24 PM          1,009 MSDNConsoleApp.sln
                 3 File(s)           8,155 bytes
                 3 Dir(s)   92,069,982,208 bytes free

At offset 034H is the 20-byte SHA-1 value for the blob object:
1ff0c423042b46cb1d617b81efb715defbe8054d. Remember, this SHA-1 points
to the blob object that contains the file contents for the file in
question: .gitattributes.

At 048H is a 2-byte value containing two 1-bit flags, a 2-bit
merge-stage value, and a 12-bit length of the path/file name for the
current index entry. Of the two 1-bit flags, the high-order bit
designates whether the index entry has its assume-unchanged flag set
(typically done using the Git update-index plumbing command); the
low-order bit indicates whether another two bytes of data precede the
path\file name entry—this bit can be 1 only for index versions 3 and
higher. The next 2 bits hold a merge-stage value from 0 to 3, as
described earlier. The 12-bit value contains the length of the
path\file name string.

If the extended flag was set, a 2-byte value holds the skip-worktree
and intent-to-add bit flags, along with filler placeholders.

Finally, a variable-length sequence of bytes contains the path\file
name. This value is terminated with one or more NUL characters.
Following that termination is the next blob object in the index or one
or more index extension entries (as you'll see shortly).

Earlier, I mentioned that Git doesn't build tree objects until you
commit what's been staged. What that means is the index starts out with
only path/file names and references to blob objects. As soon as you
issue a commit, however, Git updates the index so it contains references
to the tree objects it created during the last commit. If those
directory references still exist in your working directory during the
next commit, the cached tree object references can be used to reduce
the work Git needs to do during the next commit. As you can see, the
role of the index is multifaceted, and that's why it's described as an
index, staging area and cache.

The index entry shown in Figure 7 supports only blob object references.
To store tree objects, Git uses an extension.

Figure 7 The Git Index File-Index Entry Data Format

  Index File - Index Entry
  4 bytes    32-bit created time in seconds    Number of seconds since Jan. 1, 1970, 00:00:00.
  4 bytes    32-bit created time -             Nanosecond component of the created
             nanosecond component              time in seconds value.
  4 bytes    32-bit modified time in seconds   Number of seconds since Jan. 1, 1970, 00:00:00.
  4 bytes    32-bit modified time -            Nanosecond component of the modified
             nanosecond component              time in seconds value.
  4 bytes    device                            Metadata associated with the file—these
  4 bytes    inode                             originate from file attributes used on the
  4 bytes    mode                              Unix OS.
  4 bytes    user id
  4 bytes    group id
  4 bytes    file content length               Number of bytes of content in the file.
  20 bytes   SHA-1                             Corresponding blob object's SHA-1 value.
  2 bytes    Flags (high to low bits)          1 bit: assume-valid/assume-unchanged flag
                                               1 bit: extended flag (must be 0 for versions
                                               less than 3; if 1, an additional 2 bytes
                                               follow before the path\file name)
                                               2 bits: merge stage
                                               12 bits: path\file name length (if less
                                               than 0xFFF)
  2 bytes    Flags (high to low bits)          1 bit: future use
  (version 3                                   1 bit: skip-worktree flag (sparse checkout)
  or higher)                                   1 bit: intent-to-add flag (git add -N)
                                               13 bits: unused, must be zero
  Variable   Path/file name                    NUL terminated

Index Extensions

The index can include extension entries that store specialized data
streams to provide additional information for the Git engine to consider
as it monitors files in the working directory and when it prepares the
next commit. To cache tree objects created during the last commit, Git
adds a tree extension object to the index for the working directory's
root as well as for each sub-directory.

Figure 5, Marker 2, shows the final bytes of the index and captures the
tree objects that are stored in the index. Figure 8 shows the format
for the tree-extension data.

The tree-extension data header, which appears at offset 284H, is
composed of the string "TREE" (marking the start of the cached tree
extension data) followed by a 32-bit value that indicates the length of
the extension data that follows. Next are entries for each tree entry:
The first entry is a variable-length null-terminated
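As a sanity check on the 20-byte SHA-1 stored in each index entry, note that a blob's SHA-1 is not a hash of the raw file bytes alone: Git hashes a short header followed by the contents. A minimal sketch of my own (not from the article):

```python
import hashlib

def blob_sha1(content: bytes) -> str:
    """Compute the SHA-1 Git assigns to a blob: the hash covers the
    header 'blob <size-in-bytes>\\0' followed by the file's contents."""
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# The empty blob always hashes to a well-known value:
print(blob_sha1(b""))  # e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
```

Running blob_sha1 over the bytes of the author's .gitattributes file would reproduce the 1ff0c4... value seen at offset 034H; this is the same computation `git hash-object` performs.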



INTENSE TRAINING FOR DEVELOPERS, ENGINEERS, PROGRAMMERS, ARCHITECTS AND MORE!

CHICAGO | SEPT 18-21, 2017 | DOWNTOWN MARRIOTT

➤ Visual Studio / .NET Framework    ➤ ASP.NET / Web Server
➤ JavaScript / HTML5 Client         ➤ Agile
➤ Native Client                     ➤ ALM / DevOps
➤ Software Practices                ➤ Cloud Computing
➤ Database and Analytics            ➤ Windows Client
➤ Angular JS                        ➤ Xamarin

Register by August 18 and Save $200! Use promo code VSLCHTI. REGISTER NOW

EVENT PARTNER | GOLD SPONSOR | SUPPORTED BY | PRODUCED BY

vslive.com/chicago


ANAHEIM, CA | OCT 16-19, 2017 | HYATT® REGENCY
A Disneyland Good Neighbor Hotel

INTENSE TRAINING FOR DEVELOPERS, ENGINEERS, PROGRAMMERS, ARCHITECTS AND MORE!

➤ Visual Studio / .NET         ➤ ASP.NET Core
➤ JavaScript / HTML5           ➤ Web API
➤ Angular                      ➤ ALM / DevOps
➤ Native Mobile & Xamarin      ➤ Cloud Computing
➤ Software Practices           ➤ UWP
➤ Database and Analytics       ➤ Unit Testing

Register by August 25 and Save $300! Use promo code VSLANTI. REGISTER NOW

EVENT PARTNER | SUPPORTED BY | PRODUCED BY

vslive.com/anaheim


string value for the tree path (or simply NUL for the root tree). The
following value is an ASCII number, so it is to be read as the "7" you
see in the hex editor—the number of blob entries covered by the current
tree (because this is the root tree, it has the same number of entries
you saw earlier when issuing the Git ls-files --stage command). The
next character is a space, followed by another ASCII number to represent
the number of subtrees the current tree has.

Figure 8 The Git Index File Tree-Extension Object Data Format

  Index File - Cached Tree-Extension Header
  4 bytes       TREE                   Fixed signature for a cached
                                       tree-extension entry.
  4 bytes       Extension data length  32-bit number representing the length
                                       of the TREE extension data that follows.

  Cached Tree-Extension Entry
  Variable      Path                   NUL-terminated path string (NUL only
                                       for the root tree).
  ASCII number  Number of entries      ASCII number representing the number of
                                       entries in the index covered by this
                                       tree entry.
  1 byte        20H (space character)
  ASCII number  Number of subtrees     ASCII number representing the number of
                                       subtrees this tree has.
  1 byte        0AH (linefeed character)
  20 bytes      Tree object's SHA-1    SHA-1 value of the tree object this
                                       entry produces.

The root tree for our project has only 1 subtree: MSDNConsoleApp. This
value is followed by a linefeed character, then the SHA-1 for the tree.
The SHA-1 starts at offset 291H, beginning with 0d21e2.

Let's confirm that 0d21e2 is actually the root tree SHA-1. To do that,
go to the command window and enter:

  git log

This displays details of the recent commits:

  commit 5192391e9f907eeb47aa38d1c6a3a4ea78e33564
  Author: Jonathan Waldman <[email protected]>
  Date: Mon May 8 21:24:15 2017 -0500

      Add project files.

  commit dc0d3343fa24e912f08bc18aaa6f664a4a020079
  Author: Jonathan Waldman <[email protected]>
  Date: Mon May 8 21:24:07 2017 -0500

      Add .gitignore and .gitattributes.

The most recent commit is the one with the timestamp 21:24:15, so that's
the one that last updated the index. I can use that commit's SHA-1 to
find the root-tree SHA-1 value:

  git cat-file -p 51923

This generates the following output:

  tree 0d21e2f7f760f77ead2cb85cc128efb13f56401d
  parent dc0d3343fa24e912f08bc18aaa6f664a4a020079
  author Jonathan Waldman <[email protected]> 1494296655 -0500
  committer Jonathan Waldman <[email protected]> 1494296655 -0500

The preceding tree entry is the root tree object. It confirms that the
0d21e2 value at offset 291H in the index dump is, in fact, the SHA-1
for the root tree object.

The other tree entries appear immediately after the SHA-1 value,
starting at offset 2A5H. To confirm the SHA-1 values for cached tree
objects under the root tree, run this command:

  git ls-tree -r -d master

This displays only the tree objects, recursively on the current branch:

  040000 tree c7c367f2d5688dddc25e59525cc6b8efd0df914d MSDNConsoleApp
  040000 tree 2723ceb04eda3051abf913782fadeebc97e0123c MSDNConsoleApp/Properties

The mode value of 040000 in the first column indicates that this object
is a directory rather than a file.

Finally, the last 20 bytes of the index contain an SHA-1 hash
representing the index itself: As expected, Git uses this SHA-1 value
to validate the data integrity of the index.

While I've covered all of the entries in this article's example index
file, larger and more complex index files are the norm. The index file
format supports additional extension data streams, such as:

• One that supports merging operations and merge-conflict resolution.
  It has the signature "REUC" (for resolve undo conflict).
• One for maintaining a cache of untracked files (these are files to be
  excluded from tracking, specified in the .gitignore and
  .git\info\exclude files and by the file pointed to by
  core.excludesfile). It has the signature "UNTR."
• One to support a split-index mode in order to speed index updates for
  very large index files. It has the signature "link."

The index's extension feature makes it possible to continue adding to
its capabilities.

Wrapping Up

In this article, I reviewed the Git three-tree architecture and delved
into details behind its index file. I showed you that Git updates the
index in response to certain operations and that it also relies on
information the index contains in order to carry out other operations.
It's possible to use Git without thinking much about the index. Yet
having knowledge about the index provides invaluable insight into Git's
core functionality while shedding light on how Git detects changes to
files in the working directory, what the staging area is and why it's
useful, how Git manages merges, and why Git performs some operations so
quickly. It also makes it easy to understand command-line variants of
the checkout and rebase commands—and the difference between soft, mixed
and hard resets. Such features let you specify whether the index, the
working directory, or both should be updated when issuing certain
commands. You'll see such options when reading about Git workflows,
strategies and advanced operations. The purpose of this article is to
orient you to the important role the index plays so you can better
digest the ways in which it can be leveraged.

Jonathan Waldman is a Microsoft Certified Professional who has worked
with Microsoft technologies since their inception and who specializes
in software ergonomics. Waldman is a member of the Pluralsight technical
team and he currently leads institutional and private-sector
software-development projects. He can be reached at [email protected].

Thanks to the following Microsoft technical experts for reviewing this
article: Kraig Brockschmidt, Saeed Noursalehi, Ralph Squillace and
Edward Thomson

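Reproducing the integrity check on the index (the trailing 20 bytes are a SHA-1 over everything that precedes them) takes only a few lines. This sketch is mine, not the article's; the helper name verify_index_checksum is made up:

```python
import hashlib

def verify_index_checksum(index_bytes: bytes) -> bool:
    """Return True if the final 20 bytes of a Git index file equal the
    SHA-1 hash of everything that precedes them."""
    body, stored = index_bytes[:-20], index_bytes[-20:]
    return hashlib.sha1(body).digest() == stored

# Against a real repository:
# print(verify_index_checksum(open(".git/index", "rb").read()))
```

A corrupted index (say, a truncated write) fails this check, which is exactly how Git detects the damage.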


MICROSOFT OFFICE

Actionable Messages for Outlook

Woon Kiat Wong

I love e-mail. At work, it's where I go to stay on top of what's going
on and what I need to do. It's where I receive notifications of new
expense reports submitted by my team, new replies to my tweets, new
comments to my pull requests and so on. But e-mail could be so much
better. Why do I need to click a link in e-mail and wait for the finance
system Web site to load in a browser before I can approve an expense
report? Why do I have to mentally change my context? I should be able
to approve the expense report directly in the context of my e-mail
client.

Sound familiar? Outlook is about to make your life much better, save
you time and make you more productive.

Introducing Actionable Messages

Actionable Messages let users complete tasks within the e-mail itself.
They offer a native experience in both the Outlook desktop client and
Outlook Web Access (OWA). In this article, I'll use the word Outlook to
mean either the Outlook desktop client or OWA.

In the example I'll be using, the fictional company Contoso has an
internal expense approval system. Every time an employee submits an
expense report, an e-mail message is sent to the manager for approval.
I'll walk through the steps on how to use Actionable Messages in
Outlook to let the manager approve the request within the e-mail
message itself.

This article discusses:
• Creating and sending Actionable Messages
• Validating a JSON Web Token
• Designing Web services for Actionable Messages

Technologies discussed:
Actionable Message Markup, Actionable Message JSON Web Token,
Actionable Message Security

My First Actionable Message

In Figure 1, you see the HTML of an Actionable Message. It might look
complicated, but believe me, it's not. I'll explain the markup in
detail in the following sections. The first step is to send an e-mail
with the markup from Figure 1 to your Office 365 e-mail account. As
shown in Figure 2, in the message itself there's a message card with
two buttons with which you can interact. If you click the Approve
button, it'll result in an error for now because you haven't yet
specified the URL for the action. You'll add the URL later. If you
click the View Expense button, a browser will open and navigate to the
Expense Approval Web site.

MessageCard Markup

The e-mail message itself is typical HTML markup. To make it an
Actionable Message in Outlook, you insert MessageCard markup in a
<script> element. One main advantage of this approach is that e-mail
messages will continue to render as usual on clients that don't
recognize the MessageCard markup. The format of this markup is called
JSON-LD, which is a standard format to create machine-readable data
across the Internet. Now, let's go through the markup in detail. These
two lines of code are mandatory in every markup:

  "@context": "http://schema.org/extensions",
  "@type": "MessageCard",
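The embedding approach just described (JSON-LD inside a <script> element in an otherwise ordinary HTML message) can be sketched in a few lines. This illustration is my own, not code from the article; the helper name build_actionable_html and the sample title are made up:

```python
import json

def build_actionable_html(card: dict, fallback_html: str) -> str:
    """Embed MessageCard JSON-LD in a <script> element so capable
    clients render the card while others show the fallback body."""
    payload = json.dumps(card, indent=2)
    return (
        "<html><head>"
        '<script type="application/ld+json">' + payload + "</script>"
        "</head><body>" + fallback_html + "</body></html>"
    )

# The two mandatory lines, plus a title, as a Python dict.
card = {
    "@context": "http://schema.org/extensions",
    "@type": "MessageCard",
    "title": "Expense report is pending your approval",
}
html = build_actionable_html(card, "<p>Please approve expense report #98432019.</p>")
```

Sending the result is ordinary e-mail work, for example as a text/html MIME part via smtplib, addressed to your Office 365 account.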



MSDN Magazine Vendor Profile

The Build vs. Buy Data Quality Challenge for Optimizing Accuracy
Q&A with Bud Walker, Vice President of Enterprise Sales & Strategy

Q Who is Melissa, and what solutions do you provide?

A Melissa Global Intelligence was founded in 1985 as Melissa Data. For
over three decades, we have been a leading provider of data quality, ID
verification, and data management solutions. Our software, cloud
services, and data integration components leverage comprehensive and
authoritative reference data to profile, verify, standardize,
consolidate, match/dedupe, enrich, and update U.S. and global contact
data, including names, addresses, phone numbers and email addresses,
for improved analytics, efficient operations, and strong customer
relationships.

Q What are the most common causes of poor data quality?

A The major cause of bad data is typographical errors and
non-conforming data entered during the data entry process—either by
employees or by customers filling out contact forms. The next biggest
cause of bad data comes from migrating data due to irregular, missing,
or misplaced data values that cause surprises. Bad data costs
businesses between 10-25% of revenue each year, and in 2016 cost US
businesses over $3.1 trillion. That's why implementing real-time
validation of contact information like addresses, email, phone, and
other important information is essential, as well as establishing a
data governance team in charge of understanding the impact of data
quality.

Q What is the difference between rules-based and active data quality,
and why is active data quality so important?

A Data that is mostly static, internally generated and controlled—like
KPIs, employee performance metrics, new product development, supplier
payment optimization, and inventory reduction data—can usually be well
managed by rules-based validation.

Active data like customer names, addresses, emails, phone numbers,
company names, and job titles constantly changes and requires complex
parsing rules and multisourced reference data for verification. For
instance, the telephone number 949-555-5659 might look valid from a
rules-based perspective, but is it actually callable, and associated
with the right customer?

Active data quality relies on deep domain knowledge of contact and
location data to parse, format, cleanse, enrich, and match customer
records to provide the organization with accurate, timely, and
actionable information.

Q Should companies build out a DQ solution or purchase an off-the-shelf
one from a trusted vendor?

A We recommend a hybrid approach. Companies should look to Build their
own data quality solutions to handle rules-based data quality
processes—those where they have the expertise and experience with their
own internal data. Where active data quality is required, a Buy
approach will usually be more efficient and cost-effective in the long
run. Combining both Build and Buy can result in best-of-both-worlds
results. We urge organizations to strive for small wins first—solve one
problem at a time in discrete phases. This will help you show tangible
results quickly and get the buy-in you need for future projects. Now,
rinse and repeat.

Q How are Melissa's solutions employed?

A Melissa offers every kind of integration option you can imagine. We
have on-prem APIs and Cloud services that allow you to build into
existing or custom applications. We also offer plugins for data
integration platforms like SQL Server®, Pentaho® and Talend®, and CRM
software like Salesforce® and Dynamics® CRM. Our smart, sharp tools
approach means we can help you create the best solutions based on your
budget and needs and achieve data quality without breaking the bank.

For more information, please visit www.melissa.com



Figure 1 HTML of an Outlook Actionable Message

  <html>
  <head>
    <meta http-equiv="Content-Type" content="text/html; charset=utf8">
    <script type="application/ld+json">{
      "@context": "http://schema.org/extensions",
      "@type": "MessageCard",
      "hideOriginalBody": "true",
      "title": "Expense report is pending your approval",
      "sections": [{
        "text": "Please review the expense report below.",
        "facts": [{
          "name": "ID",
          "value": "98432019"
        }, {
          "name": "Amount",
          "value": "83.27 USD"
        }, {
          "name": "Submitter",
          "value": "Kathrine Joseph"
        }, {
          "name": "Description",
          "value": "Dinner with client"
        }]
      }],
      "potentialAction": [{
        "@type": "HttpPost",
        "name": "Approve",
        "target": ""
      }, {
        "@type": "OpenUri",
        "name": "View Expense",
        "targets": [ { "os": "default",
          "uri": "https://expense.contoso.com/view?id=98432019"} ]
      }]
    }
    </script>
  </head>
  <body>
    <p>Please <a href="https://expense.contoso.com/view?id=98432019">approve</a>
    expense report #98432019 for $83.27.</p>
  </body>
  </html>

You set the context to http://schema.org/extensions and the type to
"MessageCard." The MessageCard type indicates that this e-mail is an
Actionable Message.

Next is the property "hideOriginalBody." When the value is set to true,
the e-mail body is hidden and only the card is displayed, as shown in
Figure 2. This is useful when the card itself contains all the
information a user would need or the content of the card is redundant
with the content of the e-mail body. In case the message is viewed in
an e-mail client that doesn't understand message cards, the original
body will be shown and the message card will not, regardless of the
value of "hideOriginalBody." The value of the property "title" is the
title of the MessageCard:

  "hideOriginalBody": "true",
  "title": "Expense report is pending your approval",

Next is "sections." You can think of a section as representing an
"activity." If your card has multiple activities, you should definitely
use multiple sections, one per activity. Figure 3 shows markup with one
section. You use the facts property of a section, which is an array of
name-value pairs, to display the details of an expense report.

Next is "potentialAction." This is an array of actions that can be
invoked on this card. Currently the supported actions are OpenUri and
HttpPOST:

  "potentialAction": [{
    "@type": "HttpPost",
    "name": "Approve",
    "target": ""
  }, {
    "@type": "OpenUri",
    "name": "View Expense",
    "targets": [ { "os": "default",
      "uri": "https://expense.contoso.com/view?id=98432019"} ]
  }]

The OpenUri action will open a browser and navigate to the URL
specified in the targets property. The targets property is an array
that lets you specify platform-specific URLs. For example, you might
want users on iOS and Android to navigate to different URLs. In this
example, you set the OS to default, which means the URL is the same for
all platforms.

The HttpPOST action will make an HTTP POST request to an external Web
service specified in the target property. Currently the value is empty.
That's why you see an error when you click the Approve button.

MessageCard Playground App

It would be great if you could visualize how the card looks when you're
authoring the markup. Microsoft has a Web app that lets you do just
that. It's called the MessageCard Playground App (bit.ly/2s274S9).

Figure 2 Actionable Message in Outlook Web Access


/update/2017/08
Experience the brand new look at www.componentsource.com

BEST SELLER Aspose.Total for .NET from $2,939.02


Create, Edit & Manipulate Word, Excel, PDF and PowerPoint Files in your .NET apps.
• Aspose.Total for .NET contains every Aspose .NET API in one complete package
• Also works with MS Visio, MS OneNote, MS Project, Email, Images, Barcodes, OCR & OMR
• Mail Merge, Document Assembly, Text Extraction, PST & OST Creation and File Conversion
• Create and Recognize Barcodes, Control Text Formatting, Paragraphs, Images and Tables
• Now includes the new Aspose.CAD and Aspose.3D APIs

BEST SELLER DevExpress DXperience 17.1 from $1,439.99


The complete range of DevExpress .NET controls and libraries for all major Microsoft platforms.
• WinForms - New TreeMap control, Chart series types and Unbound Data Source
• WPF - New Wizard control and Data Grid scrollbar annotations
• ASP.NET - New Vertical Grid control, additional Themes, Rich Editor Spell Checking and more
• Windows 10 Apps - New Hamburger Sub Menus, Splash Screen and Context Toolbar controls
• CodeRush - New debug visualizer expression map and code analysis diagnostics

BEST SELLER LEADTOOLS Document Imaging SDKs V19 from $2,995.00 SRP
Add powerful document imaging functionality to desktop, tablet, mobile & web applications.
• Universal document viewer & conversion framework for PDF, Office, CAD, TIFF & more
• OCR, MICR, OMR, ICR and Forms Recognition supporting structured & unstructured forms
• PDF SDK with text edit, hyperlinks, bookmarks, digital signature, forms, metadata
• Barcode Detect, Read, Write for UPC, EAN, Code 128, Data Matrix, QR Code, PDF417
• Zero-footprint HTML5/JavaScript UI Controls & Web Services

BEST SELLER Help & Manual Professional from $586.04


Help and documentation for .NET and mobile applications.
• Powerful features in an easy, accessible and intuitive user interface
• As easy to use as a word processor, but with all the power of a true WYSIWYG XML editor
• Single source, multi-channel publishing with conditional and customized output features
• Output to responsive HTML, CHM, PDF, MS Word, ePUB, Kindle or print
• Styles and Templates give you full design control

We accept purchase orders.


© 1996-2017 ComponentSource. All Rights Reserved. All prices correct at the time of press. Online prices may vary from those shown due to daily fluctuations & online discounts. Contact us to apply for a credit account.

US Headquarters
ComponentSource
European Headquarters
ComponentSource
Asia / Pacific Headquarters
ComponentSource Sales Hotline - US & Canada:
(888) 850-9911
650 Claremore Prof Way 2 New Century Place 7F Kojimachi Ichihara Bldg
Suite 100 East Street 1-1-8 Hirakawa-cho
Woodstock Reading, Berkshire Chiyoda-ku
GA 30188-5188 RG1 4ET Tokyo, 102-0093
USA United Kingdom Japan www.componentsource.com

Untitled-1 1 7/5/17 10:27 AM


You should always design your card in the app first. Once you’re ActionCard Action
happy with the card layout, you can then use the markup in your Now let’s add a Reject button so users can reject an expense report.
e-mail messages. For reject, you need additional input from users to explain why
the expense report is rejected.
Calling an External Web Service The ActionCard action is designed for such scenarios. It con-
with HttpPOST Action tains one or more inputs and associated actions that can be either
Now you have a message card with two actions. The OpenUri will open OpenUri or HttpPost. You insert an ActionCard action in between
a browser and navigate to the URL specified in the action. For the Http- HttpPOST and OpenUri, as shown in Figure 4.
POST action, you’d like it to call your REST API that will approve the
expense report. You replace the HttpPOST action with the following:
{
Actionable Messages will work
"@type": "HttpPost",
"name": "Approve",
"target": "https://api.contoso.com/expense/approve",
with any Web service that can
"body": "{ \"id\": \"98432019\" }"
}
When a user clicks on the Approve button, a Microsoft server
handle HTTP POST requests.
will make an HTTP POST request that’s similar to the following:
POST api.contoso.com/expense/approve
If you send yourself the updated markup, there are Approve,
Content-Type: application/json Reject and View Expense buttons. If you click on the Reject button,
{ "id": "98432019" }
you can now enter comments before you reject the expense report.
The target is the URL, which the Microsoft server is going to Let’s take a look at the ActionCard action markup. Besides the
make a POST request to, and the body is the content of the request. type and name properties, it has an array of inputs and actions. In
The body content is always assumed to be JSON. this example, you have a multiline TextInput that lets users enter
Now you’ll send yourself an e-mail with the new markup. When text. The other supported inputs are DateInput and Multichoice­
you click on the Approve button, the action is completed successfully. Input. For more details, refer to bit.ly/2t3bLJN.
You have an HttpPOST action that will make a call to the exter-
Figure 3 Card with One Section nal Web service to reject the expense report. This is similar to the
"sections": [{ HttpPOST action for the approve action. One major difference is
"text": "Please review the expense report below.", that you want to pass the comments entered by users to the Web
"facts": [{
"name": "ID", service call. You can reference to the value of the text input by
"value": "98432019" using {{rejectComment.value}}, where rejectComment is the ID
}, {
"name": "Amount", of the text input.
"value": "83.27 USD"
}, {
"name": "Submitter", Web Service for Actionable Messages
"value": "Jonathan Kiev" So far you’ve seen the markup for Actionable Messages in Outlook
}, {
"name": "Description", and how it works. In the rest of the article, I’ll describe how a
"value": "Dinner with client" Web service should handle requests coming from Actionable
}]
}], Messages in Outlook.
Actionable Messages will work with any Web service that can handle
Figure 4 ActionCard Action HTTP POST requests. In this example, your Web service is an API
controller in ASP.NET MVC. Figure 5 shows your API controller.
"potentialAction": [{
"@type": "HttpPost", There are two methods in this API controller, one for approval
... and another for rejection. The Web service must return an HTTP
}, {
"@type": "ActionCard", status code of 2xx for the action to be considered successful. The
"name": "Reject", Web service can also include the CARD-ACTION-STATUS
"inputs": [{
"@type": "TextInput", header in the response. The value of this header will be displayed
"id": "comment", to the user in a reserved area of the card. If you deploy the Web
"isMultiline": true,
"title": "Explain why the expense report is rejected" service to https://api.contoso.com and you click on the Approve
}], button, you’ll get the notification that the operation was completed
"actions": [{
"@type": "HttpPOST", successfully, as shown in Figure 6.
"name": "Reject", You now have the Actionable Message working end to end. You
"target": "https://api.contoso.com/expense/reject",
"body": "{ \"id\": \"98432019\", \"comment\": \"{{rejectComment.value}}\" }" can send out the Actionable Message and when the user clicks on
}] the Approve button, an HTTP POST request is made to your Web
},{
"@type": "OpenUri", service. Your Web service will process the request and return 200
... OK. Outlook will then mark the action as done. Next, I’ll look at
}]
how you can secure your Web service.
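The {{rejectComment.value}} substitution is performed on the Outlook side before the request is posted; you don't implement it in your Web service. Purely to illustrate the idea, here's a minimal C# sketch of that kind of placeholder substitution. The FillBody helper and its names are hypothetical and not part of any Outlook API:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Text.RegularExpressions;

public class InputSubstitutionSketch
{
    // Replaces {{inputId.value}} placeholders in an action body template
    // with the values the user typed into the card's inputs. Unknown
    // placeholders are left untouched.
    public static string FillBody(string bodyTemplate, IDictionary<string, string> inputValues)
    {
        return Regex.Replace(bodyTemplate, @"\{\{(\w+)\.value\}\}", m =>
        {
            string id = m.Groups[1].Value;
            return inputValues.TryGetValue(id, out string v) ? v : m.Value;
        });
    }

    public static void Main()
    {
        string template = "{ \"id\": \"98432019\", \"comment\": \"{{rejectComment.value}}\" }";
        var inputs = new Dictionary<string, string> { ["rejectComment"] = "Missing receipt" };
        string body = FillBody(template, inputs);
        Debug.Assert(body == "{ \"id\": \"98432019\", \"comment\": \"Missing receipt\" }");
        Console.WriteLine(body);
    }
}
```

Note that a real implementation would also need to escape any quotes in the user's comment so the resulting body stays valid JSON.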


Figure 5 Expense API Controller

  [RoutePrefix("expense")]
  public class ExpenseController : ApiController
  {
    [HttpPost]
    [Route("approve")]
    public HttpResponseMessage Approve([FromBody]JObject jBody)
    {
      string expenseId = jBody["id"].ToString();
      // Process and approve the expense report.
      HttpResponseMessage response = this.Request.CreateResponse(HttpStatusCode.OK);
      response.Headers.Add("CARD-ACTION-STATUS", "The expense was approved.");
      return response;
    }

    [HttpPost]
    [Route("reject")]
    public HttpResponseMessage Reject([FromBody]JObject jBody)
    {
      string expenseId = jBody["id"].ToString();
      string comment = jBody["comment"].ToString();
      // Process and reject the expense report.
      HttpResponseMessage response = this.Request.CreateResponse(HttpStatusCode.OK);
      response.Headers.Add("CARD-ACTION-STATUS", "The expense was rejected.");
      return response;
    }
  }

Limited-Purpose Tokens
Because the expense ID usually follows a certain format, there's a risk that an attacker can perform an attack by posting a lot of requests with different expense IDs. If an attacker successfully guesses an expense ID, the attacker might be able to approve or reject that expense report. Microsoft recommends developers use "limited-purpose tokens" as part of the action target URL or in the body of the request. The limited-purpose token should be hard for attackers to guess. For example, I use a GUID, a 128-bit number, as the limited-purpose token. This token can be used to correlate service URLs with specific requests and users. It can also be used to protect Web services from replay attacks (bit.ly/2sBQmdn). You update the markup to include a GUID in the body:

  {
    "@type": "HttpPost",
    "name": "Approve",
    "target": "https://api.contoso.com/expense/approve",
    "body": "{ \"id\": \"98432019\", \"token\": \"d8a0bf4f-ae70-4df6-b129-5999b41f4b7f\" }"
  }

Bearer Token
While limited-purpose tokens make it harder for attackers to forge a request, they're still not perfect. Ideally, a Web service should be able to tell whether an HTTP POST request is coming from a Microsoft server instead of some unauthorized, potentially malicious server.
Microsoft solves this problem by including a bearer token in every HTTP POST request it sends to Web services. The bearer token is a JSON Web Token (JWT) and it's included in the Authorization header of a request. When a user clicks on the Approve button, the Web service will receive a request that looks like this:

  POST https://api.contoso.com/expenses/approve
  Content-Type: application/json
  Authorization: Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsIng1dCI6... (truncated for brevity)

  {
    "id": "98432019",
    "token": "d8a0bf4f-ae70-4df6-b129-5999b41f4b7f"
  }

What follows "Bearer" in the Authorization header is a long base-64 encoded string that's a JSON Web Token (JWT). You can decode the JWT at jwt.calebb.net. Figure 7 shows a sample token after it's decoded.
Every JWT has three segments, separated by a dot (.). The first segment is the header, which describes the cryptographic operations applied to the JWT. In this case, the algorithm (alg) used to sign the token is RS256, which means RSA using the SHA-256 hash algorithm. The x5t value specifies the thumbprint of the key used to sign the token.
The second segment is the payload itself. It has a list of claims that the token is asserting. Web services should use these claims to verify a request. The table in Figure 8 describes these claims.
The third segment is the digital signature of the token. By verifying the signature, Web services can be confident that the token was sent by Microsoft and trust the claims in the token.
Verifying a digital signature is a complex task. Fortunately, there's a library on NuGet that makes the verification task easy. The library is available at bit.ly/2stq90c and it's authored by Microsoft. Microsoft also published code samples for other languages on how to verify the token. Links for these code samples are available at the end of this article.

Figure 6 Expense Report with Successful Approval Notification
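The article doesn't prescribe how the GUID is issued or checked, only that it accompanies the request. As one possible sketch, the service that generates the actionable message could mint a token per expense report and later accept an action only if the (expense ID, token) pair matches what was issued. The LimitedPurposeTokenStore class below is hypothetical; a production system would persist the tokens and expire them:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

// Hypothetical limited-purpose token store: a GUID is minted per expense
// report when the actionable message is generated and embedded in the
// action body; the Web service later validates the pair on each request.
public class LimitedPurposeTokenStore
{
    readonly Dictionary<string, Guid> _issued = new Dictionary<string, Guid>();

    public Guid Issue(string expenseId)
    {
        Guid token = Guid.NewGuid();  // 128 bits, hard to guess
        _issued[expenseId] = token;
        return token;
    }

    public bool Validate(string expenseId, string token)
    {
        return _issued.TryGetValue(expenseId, out Guid issued) &&
               Guid.TryParse(token, out Guid presented) &&
               issued == presented;
    }

    public static void Main()
    {
        var store = new LimitedPurposeTokenStore();
        Guid token = store.Issue("98432019");
        Debug.Assert(store.Validate("98432019", token.ToString()));
        Debug.Assert(!store.Validate("98432019", Guid.NewGuid().ToString()));
        Console.WriteLine("token checks passed");
    }
}
```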
Figure 7 A Sample Bearer JSON Web Token

  {
    typ: "JWT",
    alg: "RS256",
    x5t: "8qgp8TDBl2H6JyFE4Z34d2ha-kE",
    kid: "8qgp8TDBl2H6JyFE4Z34d2ha-kE"
  }.
  {
    iat: 1484089279,
    ver: "STI.ExternalAccessToken.V1",
    appid: "48af08dc-f6d2-435f-b2a7-069abd99c086",
    sub: "[email protected]",
    appidacr: "2",
    acr: "0",
    sender: "[email protected]",
    iss: "https://substrate.office.com/sts/",
    aud: "https://api.contoso.com",
    exp: 1484090179,
    nbf: 1484089279
  }.
  [signature]

After you include the NuGet package in the Web service project, you can use the VerifyBearerToken method, as shown in Figure 9, to verify the bearer token in a request.
First, the method verifies there's a bearer token in the Authorization header. Then, it initializes a new instance of ActionableMessageTokenValidator and calls the ValidateTokenAsync method. The method takes two parameters. The first one is the bearer token itself. The second one is the Web service base URL. If you look at the decoded JWT, this is the value of the aud (audience) claim. It basically means the token is issued for the intended audience, which is your Web service but not any other Web service. In this case, the API to be called is https://api.contoso.com/expense/approve. The value in the claim will be the base URL, which is https://api.contoso.com.
The method will return an instance of ActionableMessageTokenValidationResult. First, you'll check the ValidationSucceeded property.

Figure 8 Description of Claims in Payload

  Claim: Description
  iss: The token issuer. The value should always be https://substrate.office.com/sts/. The Web service should reject the token and the request if the value doesn't match.
  appid: The ID of the application that issues the token. The value should always be 48af08dc-f6d2-435f-b2a7-069abd99c086. The Web service should reject the token and the request if the value doesn't match.
  aud: The audience of the token. It should match the hostname of the Web service URL. The Web service should reject the token and the request if the value doesn't match.
  sub: The subject who performed the action. The value will be the e-mail address of the person who performed the action, if the e-mail address or any of the proxy e-mail addresses is in the To: line. If none of the e-mail addresses is matched, this will be the hashed value of the subject's user principal name (UPN). It's guaranteed to be the same hashed value for the same UPN.
  sender: The e-mail address of the original message sender.
  tid: The tenant ID of the token issuer.

Figure 9 The VerifyBearerToken Method

  private async Task<HttpStatusCode> VerifyBearerToken(
    HttpRequestMessage request, string serviceBaseUrl, string expectedSender)
  {
    if (request.Headers.Authorization == null ||
      !string.Equals(request.Headers.Authorization.Scheme, "bearer",
        StringComparison.OrdinalIgnoreCase) ||
      string.IsNullOrEmpty(request.Headers.Authorization.Parameter))
    {
      return HttpStatusCode.Unauthorized;
    }

    string bearerToken = request.Headers.Authorization.Parameter;
    ActionableMessageTokenValidator validator =
      new ActionableMessageTokenValidator();
    ActionableMessageTokenValidationResult result =
      await validator.ValidateTokenAsync(bearerToken, serviceBaseUrl);

    if (!result.ValidationSucceeded)
    {
      return HttpStatusCode.Unauthorized;
    }

    if (!string.Equals(result.Sender, expectedSender,
        StringComparison.OrdinalIgnoreCase) ||
      !result.ActionPerformer.EndsWith("@contoso.com",
        StringComparison.OrdinalIgnoreCase))
    {
      return HttpStatusCode.Forbidden;
    }

    return HttpStatusCode.OK;
  }

  [HttpPost]
  [Route("approve")]
  public async Task<HttpResponseMessage> Approve([FromBody]JObject jBody)
  {
    HttpRequestMessage request = this.ActionContext.Request;
    HttpStatusCode result = await VerifyBearerToken(
      request, "https://api.contoso.com",
      "[email protected]");

    switch (result)
    {
      case HttpStatusCode.Unauthorized:
        return request.CreateErrorResponse(
          HttpStatusCode.Unauthorized, new HttpError());

      case HttpStatusCode.Forbidden:
        HttpResponseMessage errorResponse =
          this.Request.CreateErrorResponse(HttpStatusCode.Forbidden, new HttpError());
        errorResponse.Headers.Add("CARD-ACTION-STATUS",
          "Invalid sender or the action performer is not allowed.");
        return errorResponse;

      default:
        break;
    }

    string expenseId = jBody["id"].ToString();
    // Process and approve the expense report.
    HttpResponseMessage response = this.Request.CreateResponse(HttpStatusCode.OK);
    response.Headers.Add("CARD-ACTION-STATUS", "The expense was approved.");
    return response;
  }
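For intuition only, the three-segment structure can be explored with a few lines of C#. This sketch base64url-decodes a payload so its claims can be read; it deliberately does not verify the signature, which is the part you should leave to the Microsoft-authored library. The helper names are mine, not part of that library:

```csharp
using System;
using System.Diagnostics;
using System.Text;

// Illustrative only: splits a JWT into its three segments and base64url-
// decodes the payload so the claims can be inspected. No signature check.
public class JwtSegmentSketch
{
    public static string DecodeSegment(string segment)
    {
        string s = segment.Replace('-', '+').Replace('_', '/');
        switch (s.Length % 4)           // restore stripped base64 padding
        {
            case 2: s += "=="; break;
            case 3: s += "="; break;
        }
        return Encoding.UTF8.GetString(Convert.FromBase64String(s));
    }

    // Helper used here to build a sample token for demonstration.
    public static string EncodeSegment(string json)
    {
        return Convert.ToBase64String(Encoding.UTF8.GetBytes(json))
            .TrimEnd('=').Replace('+', '-').Replace('/', '_');
    }

    public static void Main()
    {
        string token = EncodeSegment("{\"alg\":\"RS256\"}") + "." +
                       EncodeSegment("{\"aud\":\"https://api.contoso.com\"}") + "." +
                       "fake-signature";
        string[] segments = token.Split('.');
        Debug.Assert(segments.Length == 3);
        string payload = DecodeSegment(segments[1]);
        Debug.Assert(payload.Contains("\"aud\":\"https://api.contoso.com\""));
        Console.WriteLine(payload);
    }
}
```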



If the validation succeeded, ValidationSucceeded will be true; otherwise, it'll be false.
The result also includes two other properties that will be useful to third parties. The first one is Sender. This is the value of the sender claim in the token, which is the e-mail address of the account that sent the actionable message. The second one is ActionPerformer, which is the value of the sub claim. This is the e-mail address of the person who performed the action. In this example, only those with @contoso.com e-mail addresses can approve or reject an expense report. You can replace the code with a more complicated verification of your own.

Refresh Card
So far the only way to provide feedback to a user is through the CARD-ACTION-STATUS header. The value of the header will be displayed to the user in a reserved area of the card. Another option is to return a refresh card to the user. The idea is to replace the current action card with a different card. There are a few reasons why you'd want to do that. For example, after an expense report is approved, you don't want users to be able to approve or reject the expense report again. Instead, you'd like to tell the user that the expense report is already approved. Figure 10 shows the markup that you'll return.
You need to set the value of the header CARD-UPDATE-IN-BODY to true so Microsoft servers know that the response has a refresh card. Figure 11 shows how the Approve method returns a refresh card.

Figure 10 Markup Returned to Expense Report with Refresh Card

  {
    "@context": "http://schema.org/extensions",
    "@type": "MessageCard",
    "hideOriginalBody": "true",
    "title": "Expense report #98432019 was approved",
    "sections": [{
      "facts": [{
        "name": "ID",
        "value": "98432019"
      }, {
        "name": "Amount",
        "value": "83.27 USD"
      }, {
        "name": "Submitter",
        "value": "Kathrine Joseph"
      }, {
        "name": "Description",
        "value": "Dinner with client"
      }]
    }]
  }

Figure 11 The Approve Method Returns a Refresh Card

  private HttpResponseMessage CreateRefreshCard(
    HttpRequestMessage request, string actionStatus,
    string expenseID, string amount, string submitter, string description)
  {
    // Literal JSON braces are doubled ({{ and }}) so string.Format
    // treats them as text rather than as format placeholders.
    string refreshCardFormatString =
      "{{\"@context\": \"http://schema.org/extensions\"," +
      "\"@type\": \"MessageCard\",\"hideOriginalBody\": \"true\"," +
      "\"title\": \"Expense report #{0} was approved\"," +
      "\"sections\": [{{\"facts\": [" +
      "{{\"name\": \"ID\",\"value\": \"{0}\"}}," +
      "{{\"name\": \"Amount\",\"value\": \"{1}\"}}," +
      "{{\"name\": \"Submitter\",\"value\": \"{2}\"}}," +
      "{{\"name\": \"Description\",\"value\": \"{3}\"}}]}}]}}";
    string refreshCardMarkup = string.Format(
      refreshCardFormatString,
      expenseID,
      amount,
      submitter,
      description);

    HttpResponseMessage response = request.CreateResponse(HttpStatusCode.OK);
    response.Headers.Add("CARD-ACTION-STATUS", actionStatus);
    response.Headers.Add("CARD-UPDATE-IN-BODY", "true");
    response.Content = new StringContent(refreshCardMarkup);

    return response;
  }

  [HttpPost]
  [Route("approve")]
  public async Task<HttpResponseMessage> Approve([FromBody]JObject jBody)
  {
    HttpRequestMessage request = this.ActionContext.Request;
    HttpStatusCode result = await VerifyBearerToken(
      request, "https://api.contoso.com",
      "[email protected]");

    switch (result)
    {
      case HttpStatusCode.Unauthorized:
        return request.CreateErrorResponse(
          HttpStatusCode.Unauthorized, new HttpError());

      case HttpStatusCode.Forbidden:
        HttpResponseMessage errorResponse =
          this.Request.CreateErrorResponse(
            HttpStatusCode.Forbidden, new HttpError());
        errorResponse.Headers.Add("CARD-ACTION-STATUS",
          "Invalid sender or the action performer is not allowed.");
        return errorResponse;

      default:
        break;
    }

    string expenseId = jBody["id"].ToString();
    // Process and approve the expense report.

    return CreateRefreshCard(
      request,
      "The expense was approved.",
      "98432019",
      "83.27 USD",
      "Jonathan Kiev",
      "Dinner with client");
  }

Wrapping Up
Actionable Messages let users complete tasks within Outlook in a secure way. The feature is available in desktop Outlook and Outlook Web Access today, and it's coming to Outlook for Mac and Outlook Mobile soon. It's straightforward to implement Actionable Messages. First, you need to add the required markup to the e-mails you're sending out. Second, you need to verify the bearer token sent by Microsoft in your Web service. Actionable Messages will make your users happier and more productive. There's much more to Actionable Messages than this article can cover. Visit bit.ly/2rAD6AZ for the complete references and links to the code samples.

I'd like to acknowledge Sohail Zafar, Edaena Salinas Jasso, Vasant Kumar Tiwari, Mark Encarnacion and Miaosen Wang, who helped review this article for grammar, spelling, and flow.

Woon Kiat Wong is a software engineer from the Knowledge Technologies Group in Microsoft Research. He works closely with the Outlook team to deliver Actionable Messages. Contact him at [email protected].

Thanks to the following Microsoft technical experts for reviewing this article: Pretish Abraham, David Claux, Mark Encarnacion and Patrick Pantel


AZURE

Batch Processing Using a Serverless Architecture

Joseph Fultz

For Azure enterprise customers, a key challenge in managing their cloud footprint is the ability to control what they spend and to charge those costs back to the consumers. Fortunately, there are several vendors that provide tools, such as Cloud Cruiser, Cloudyn, and Cloudability, to help with collecting usage data and generating a rich set of reports. Additionally, you can find many good examples of how to pull data programmatically, such as the post from a former co-worker of mine, Ed Mondek, in which he shows how to pull data into Excel and view it (bit.ly/2rzDOPI). However, if you want to pull that data regularly and enable historical, present trend and predictive views, you need to store a lot more data. For a large enterprise with thousands of resources per subscription, that amount of data can be daunting and is certainly not what you'd want to fetch and keep on a local machine.
Luckily, there's another way. In this article I'm going to walk you through the serverless Extract-Transform-Load (ETL) process I set up to extract such data, provide a little enrichment and store the data to a place where further work (analytics, map-reduce and so forth) can be done. I'll touch on the overall design, key decision points and important things to consider in taking a serverless approach.

This article discusses:
• Design considerations in a serverless architecture
• Integrating multiple Azure Platform-as-a-Service capabilities
• Billing data retrieval

Technologies discussed:
Azure Cosmos DB, Azure Functions, Azure Blob Storage

Determining Consumption
The first decision is choosing between the Enterprise Agreement (EA) Billing API and the Azure Billing API, which centers its requests around specific subscriptions. My prototype is targeted at enterprise customers with multiple enrollments in an EA. In the scenario with which I'm working, subscriptions are being used as part of the management boundaries for both specific product groups and for separating production from non-production resources. This could result in a fairly high number of subscriptions in flux due to the volatile proof-of-concept (PoC) type of work being created as new groups and new product lines start up in Azure. Thus, I chose to work with the EA API because it reduced the scope of work in that I don't have to create a discovery mechanism for subscriptions. This leaves me with the noted challenge of not having any data for subscriptions created outside of the enrollments for the enterprise. While this is an important area to tackle, it comes with a number of other process and management challenges that have to be solved organizationally and is outside the scope of the work I want to accomplish.

Requirements and Logical Flow
In any architecture, it's the intersections between systems that require the most scrutiny in design and testing. A serverless architecture doesn't change the need to consider the volume of data that moves


through the supersystem, and must take into account the particular constraints of the discrete subsystems. The principal change in architecting such a supersystem is more in the depth or scope when defining the system, such as sizing a queue for throughput, but not sizing the hardware that hosts it. You must still consider latency, connectivity, volume, availability, cost, and any number of other factors, but the work of sizing and defining the particulars of the service ends once you've defined the capacity and the cost of the capability needed to meet the identified requirements. There's no additional work of defining the host environment and all its needed artifacts as you might have done in the past.
Before I get into designing what the overall flow of information into the system will look like, let's note a few facts about the source systems and some requirements for the end-state system:
• All of the data for every subscription under the EA will be returned for all resources for every day it's available in the designated month. This can result in a lot of data, with a linear growth as the month progresses.
• Any and all records may be updated throughout the month. The stated settlement timing is 72 hours. As a point of safety, I'll consider all records in flux for a given month until 72 hours past the beginning of the subsequent month.
• The usage data isn't returned with an ID for the enrollment, so I'll have to add it.
• Determining cost is a separate activity and requires retrieving the rate card and further processing.
• No information will be received for subscriptions that aren't in the specified EA.
Additionally, there are a few technical business requirements that the prototype must include:
• The ability to create read-only and geographically distributed datasets must be included.
• Processing performance should be adjustable for cost versus performance.
• The ability to secure access at the subscription level should be designed in.
The overall flow itself is fairly simple in that I'm simply going to retrieve the data, add a small amount of information and persist it into the target storage.
As depicted in Figure 1, the path for getting the data to its target is fairly simple because there's no integration with any external systems other than the EA Billing API. I know that when I work through the data, I'll have to do some amount of initial processing and enrichment (for example, add the enrollment ID), and on the persistence side I'll have to deal with existing records from the previous day's fetches. I'll probably want to look at separating those two processes.

Figure 1 Logical Flow [diagram: Fetch Data for Enrollment -> Intake Queue -> Enrich and Split for Processing (in parallel) -> Persistence Queue]

Thus, you see three major blocks that represent retrieval, enrichment and persistence, which are all separated by some queuing mechanism. The complications start after I make some technology picks and start looking at the details of implementing with those components and making the processing pipeline run in parallel.

Technology Mapping
At this point in the process, two factors beyond the requirements of the overall system may come into play: enterprise standards and personal preference. If these are in conflict, the result can be almost endless debate. Fortunately, in this instance I don't have to worry about this. I do have my own mix of constraints, along with those I noted from the initial requirements. In this case, I'd like to make sure to hit these marks:
• Simplest compute provisioning and edits/updates for quick cycles of testing
• Easy automatic scaling
• Easy provisioning for geographic distribution of data
• Easy mechanisms for scheduling and triggering work
Here, I want to focus on the work and not on the system setup. I'll leave things like cost analysis for various implementations and adherence to corporate standards until after I have a working prototype. I did consider some alternatives, such as Azure SQL Database versus Azure Cosmos DB, but I'm going to focus on my choices and the primary motivations for each of those choices.
• Compute: Azure Functions will serve me well here. It meets my need for scale and simplicity while also providing easy configuration of scheduled and triggered jobs and easy integrations with bindings.
• Queuing: Keeping things simple, I'll use Azure Storage Blobs and separate the files by containers. The unknown but expectedly large size of each initial input file makes storage queues a non-option for initial retrieval, and likely takes them out of the running for processing individual subscription data splits. Beyond that, I'd like to keep the mechanism uniform and I really don't need any advanced capabilities, such as priority messages, routing, message-specific security and poisoned message handling.
• Storage: Azure Cosmos DB is indeed my friend here. Using the subscription ID as the partition key allows me to limit access by subscription, if necessary. Additionally, the ease of adding and removing read and read-write geographically distributed replicas and native support in Power BI makes this a no-brainer for my system. Last, I have to admit a little personal bias: I want a proper document storage mechanism that supports the SQL syntax I've used for too many years to abandon.
Figure 2 represents the application of technology to the logical architecture, as well as adding some processing flow to it. I've taken the liberty of including the names I used in this diagram, but you might not have names at this stage of the design. The shapes used indicate the technology in play; the numbers on the line are the sequence in which the process is executed, and the arrows indicate which component initiates the outbound call. Note that I've identified four Azure Functions, four Azure Storage Blob Containers and three Azure Cosmos DB collections that I'll employ as the working pieces of my implementation.
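As a toy illustration of the three blocks separated by queues (retrieval, enrichment, persistence), the following sketch models the flow with in-memory queues. In the actual design the queues are Blob containers and the stages are Azure Functions, so everything here, including the stage and queue names, is illustrative only:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

public class PipelineSketch
{
    // Each stage reads from one queue, does its work and feeds the next.
    public static List<string> Run(IEnumerable<string> fetched)
    {
        var intakeQueue = new Queue<string>(fetched);   // filled by the fetch stage
        var persistenceQueue = new Queue<string>();
        var store = new List<string>();

        while (intakeQueue.Count > 0)                   // enrich and split stage
            persistenceQueue.Enqueue(intakeQueue.Dequeue() + ":enriched");

        while (persistenceQueue.Count > 0)              // persistence stage
            store.Add(persistenceQueue.Dequeue());

        return store;
    }

    public static void Main()
    {
        var store = Run(new[] { "usage-enrollment-123" });
        Debug.Assert(store.Single() == "usage-enrollment-123:enriched");
        Console.WriteLine(store[0]);
    }
}
```

The value of the queue boundaries is that each stage can scale and fail independently, which is exactly what the Blob-container handoffs buy in the real design.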


Blob Containers and three Azure
1 2
Cosmos DB collections that I’ll
employ as the working pieces of DailyEABatchControl
my implementation. 3
Separating the data into three
collections is useful for explain-
4 5
ing, but serves a grander purpose.
I won’t need the same security for EA Portal RetrieveUsage newdailyusage
(Billing API)
each of the types of documents and 6 { }
the separation makes that easy to
7 Enrollments
understand and manage. More im- newdailysplit
portant, I define the performance { } { }
DetailedUsageData
characteristics by collection and the SplitDailyUsage 8 EAUsage JobLog
separation allows me to more eas- { }
processedusage
ily optimize that by having a large
high-throughput collection specif- 9
ically for the DetailedUsageData, 11
while the other two remain minimal. ProcessDailyUsage processeddailysplit
10
Power BI Reporting 12
Figure 2 Technology Map and Data Flow

Retrieving Data
Starting with the first two legs of the data journey, I want to run something similar to what I do with a Cron job. While the WebJobs SDK itself would support this type of implementation, it would leave a lot of the work of configuring the runtime environment to me and increase my overall development effort. Because Azure Functions is built on top of the WebJobs SDK and naturally supports the Timer Trigger, it's an easy choice. I could've used Azure Data Factory because it's a tool made specifically for moving data around and it supports retrieving Web data and working with Blobs. However, that would mean I'd need to work out certain things with regard to reference data and updating duplicate records in Azure Cosmos DB when I don't have the row ID. Familiarity with development and debugging using Azure Functions, and the information I can get from Azure Functions integration with Application Insights, makes Azure Functions my preferred choice in this instance.

The Timer Trigger has an obvious function, but in order for DailyEABatchControl to know what to process, it retrieves configuration information from the Enrollments collection, which has the following schema:

{
  "enrollmentNumber": "<enrollment number>",
  "description": "",
  "accessKey": "<access key>",
  "detailedEnabled": "true",
  "summaryEnabled": "false"
}

For now, having the enrollment number, access key and a flag to turn on processing ("detailedEnabled") is sufficient for me to do work. However, should I start adding capabilities and need additional run configuration information, Azure Cosmos DB will allow me to easily add elements to the document schema without having to do a bunch of reworking and data migration.

Once DailyEABatchControl is triggered, it will loop through all of the documents and call RetrieveUsage for each enrollment that has "detailedEnabled" set to true, separating the logic to start a job from the logic to retrieve the source data. I use the JobLog collection to determine if a job has already been run for the day, as shown in Figure 3.

Figure 3 Job Control Logic

// Get list of enrollments for daily processing
List<Enrollment> enrollments =
  inputDocument.CreateDocumentQuery<Enrollment>(
    UriFactory.CreateDocumentCollectionUri(dbName, enrollmentCollection),
    new SqlQuerySpec("SELECT * FROM c WHERE c.detailedEnabled = 'true'"),
    queryOptions).ToList<Enrollment>();

// Get yesterday's date to make sure there are logs for today
int comparisonEpoch =
  (int)(DateTime.UtcNow.AddDays(-1) - new DateTime(1970, 1, 1)).TotalSeconds;

string logQuery =
  "SELECT * FROM c WHERE c.epoch > '" + comparisonEpoch.ToString() + "'";

List<JobLog> logs = inputDocument.CreateDocumentQuery<JobLog>(
  UriFactory.CreateDocumentCollectionUri(dbName, jobLogCollection),
  new SqlQuerySpec(logQuery), queryOptions).ToList<JobLog>();

// Get list of enrollments for which there is no match
var jobList = enrollments.Where(x =>
  !logs.Any(l => l.enrollmentNumber == x.enrollmentNumber));

The last lambda results in a filtered list of enrollments for which data hasn't been retrieved for the day in question. Next, I'll call RetrieveUsage (step 3 in Figure 2) from within DailyEABatchControl by calling it with HttpClient with sufficient data in the post body for it to know the enrollment for which it's fetching data and the month for which it's fetching it, as shown in Figure 4. It's worth pointing out that this isn't intended to be an open system. I'm creating a closed processing loop, so I don't want just any caller executing the RetrieveUsage Function. Thus, I've secured it by requiring a code that's not shown in Figure 4, but is part of the URI returned from GetEnvironmentVariable("retrieveUsageUri"). In an enterprise implementation, a service principal and Azure Active Directory integration would be a more realistic choice to achieve a higher degree of security.
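The trigger declaration for DailyEABatchControl itself isn't shown here. As a rough sketch only, a precompiled, timer-triggered function of this kind might be declared as follows; the CRON schedule and everything other than the function name are my assumptions, not code from the prototype:

```csharp
// Hypothetical sketch, not from the prototype: DailyEABatchControl
// declared as a precompiled, timer-triggered Azure Function.
// The schedule "0 0 6 * * *" (daily at 06:00 UTC) is an assumption.
public static class DailyEABatchControl
{
    [FunctionName("DailyEABatchControl")]
    public static async Task Run(
        [TimerTrigger("0 0 6 * * *")] TimerInfo timer,
        TraceWriter log)
    {
        log.Info($"Batch control started at {DateTime.UtcNow:o}");
        // 1. Query the Enrollments collection for detailedEnabled == 'true'
        // 2. Filter out enrollments already recorded in JobLog for the day
        // 3. POST to RetrieveUsage for each remaining enrollment
    }
}
```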

46 msdn magazine Azure

0817msdn_FultzBatch_v3_44-52.indd 46 7/12/17 11:55 AM


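The JobLog documents that the Figure 3 query relies on are never shown. Judging from the fields the code touches (epoch and enrollmentNumber), a record presumably looks something like the following; the exact shape, and any additional fields, are guesses on my part:

```json
{
  "enrollmentNumber": "<enrollment number>",
  "epoch": 1497052800
}
```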


Figure 4 Retrieving Usage Data

foreach(var doc in jobList)
{
  HttpClient httpClient = new HttpClient();

  string retrieveUsageUri = @"https://" +
    System.Environment.GetEnvironmentVariable("retrieveUsageUri");

  string postBody = "{\"enrollment\":\"" + doc.enrollmentNumber + "\"," +
    "\"month\":\"" + DateTime.Now.ToString("yyyy-MM") + "\"}";

  httpClient.DefaultRequestHeaders.Accept.Add(
    new MediaTypeWithQualityHeaderValue("application/json"));

  var content = new StringContent(postBody, Encoding.UTF8, "application/json");
  var response = await httpClient.PostAsync(retrieveUsageUri, content);

  response.EnsureSuccessStatusCode();

  string fetchResult = await response.Content.ReadAsStringAsync();
}

The last step of the first leg of my data's journey is within the RetrieveUsage function, where it's persisted to the newdailyusage container in Azure Blob Storage. However, in order to get that data I have to construct the call and include the accessKey as a bearer token in the header:

HttpClient httpClient = new HttpClient();

string retrieveUsageUri = usageQB.FullEAReportUrl();

httpClient.DefaultRequestHeaders.Add("authorization", bearerTokenHeader);
httpClient.DefaultRequestHeaders.Add("api-version", "1.0");

var response = await httpClient.GetAsync(retrieveUsageUri);
response.EnsureSuccessStatusCode();

string responseText = await response.Content.ReadAsStringAsync();

For the sake of brevity, I've cut some date manipulations out of this code block and haven't included a helper class for generating the bearerTokenHeader or the UsageReportQueryBuilder. However, this should be sufficient to illustrate how they're used and ordered. The accessKey is passed into the static method FromJwt, which will return the BearerToken type, from which I simply grab the header and add it to the request that's created from the URL constructed by the call to usageQB.FullEAReportUrl. Last, I update the output binding to the path and filename I want for the Blob target:

path = "newdailyusage/" + workingDate.ToString("yyyyMMdd")
  + "-" + data.enrollment + "-usage.json";
var attributes = new Attribute[]
{
  new BlobAttribute(path),
  new StorageAccountAttribute("eabillingstorage_STORAGE")
};

using (var writer = await binder.BindAsync<TextWriter>(attributes))
{
  writer.Write(responseText);
}

This will result in a structure in Azure Storage that looks like this:

newdailyusage/
  20170508-1234-usage.json
  20170508-456-usage.json
  20170507-123-usage.json

This allows me to store data for multiple enrollments and multiple files for each enrollment in case processing doesn't happen for some reason. Additionally, because data can change for previous days as the month progresses, it's important to have the files available for research and reconciliation in case anomalies show up in the report data.

Splitting Data for Parallel Processing
With so much data coming in and the work of somehow updating records for a given month of processing each day, it's important to process this data in a parallel fashion. Usually, at least nowadays, this is when I break out the parallel libraries for C#, write a few lines of code and pat myself on the back for being a genius at parallel processing. However, in this instance, I'd really like to just rely on the capabilities of the platform to do that for me and allow me to focus on each discrete task.

The next Azure Function in the sequence has been configured with a blob trigger so it will pick up files that land in the inbound processing storage container. The job at this step is to split the inbound file into a file-per-day per enrollment. All in all, this is a pretty simple step, but it does require deserializing the JSON file into RAM. It's important to note this, because the method I've chosen to use for the prototype simply calls the deserialize method:

JsonConvert.DeserializeObject<List<EAUsageDetail>>(myBlob);

I know this to be sufficient for my purposes, but the present RAM allocation for the Azure Function host is 1.5GB. It's possible that, for a large enrollment with substantial resources provisioned, a file would become too big at some point in the month to load into RAM, in which case an alternate method for parsing and splitting the file will have to be used. Moreover, if you create an Azure Function that takes more than five minutes to run, it will have to be modified, because the current default timeout is five minutes, though this can be adjusted to a maximum of 10 minutes via the host configuration JSON. As I mentioned early on, knowing the volume of data will be key at each point and for integration in the overall system.

Once the data has been deserialized, I'll grab the max day out of it and set up a loop from day one to day max to start selecting out the data for each of those days, as shown in Figure 5.

Figure 5 Selecting Each Day's Data

// Loop through collection filtering by day
for(int dayToProcess = 1; dayToProcess <= maxDayOfMonth; dayToProcess++)
{
  // Get documents for current processing day
  var docsForCurrentDay = results.Where(d => d.Day == dayToProcess);

  // Serialize to string
  string jsonForCurrentDay =
    JsonConvert.SerializeObject(docsForCurrentDay);
  log.Info($"***** Docs for day {dayToProcess} *****");

  // Get date for one of the records for today
  string processDateString = (from docs in results where docs.Day ==
    dayToProcess select docs.Date).First();

  path = "newdailysplit/" + DateTime.Parse(processDateString).ToString("yyyyMMdd")
    + "-" + enrollment + "-dailysplit.json";

  // Write out each day's data to file in container "\newdailysplit"
  var attributes = new Attribute[]
  {
    new BlobAttribute(path),
    new StorageAccountAttribute("eabillingstorage_STORAGE")
  };

  using (var writer = await binder.BindAsync<TextWriter>(attributes))
  {
    writer.Write(jsonForCurrentDay);
  }
}

Once all the days have been split into separate files and written out (see step 7 in Figure 2), I simply move the file to the processedusage container.
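The split loop in Figure 5 assumes that results, maxDayOfMonth and enrollment have already been established. A minimal sketch of that setup, assuming EAUsageDetail exposes the numeric Day property the loop filters on (the preamble itself isn't shown in the article):

```csharp
// Sketch of the preamble the Figure 5 loop implies; this exact code
// isn't from the prototype.
List<EAUsageDetail> results =
    JsonConvert.DeserializeObject<List<EAUsageDetail>>(myBlob);

// The highest day present in the file bounds the split loop
int maxDayOfMonth = results.Max(d => d.Day);
```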


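The move of each finished file to the processedusage container is described but not shown. With the Azure Storage SDK of the period, a blob "move" is a copy followed by a delete; the container and variable names here are my assumptions:

```csharp
// Hedged sketch of moving a processed split between containers;
// names are illustrative, not from the prototype.
CloudBlockBlob source = newdailysplitContainer.GetBlockBlobReference(fileName);
CloudBlockBlob target = processedusageContainer.GetBlockBlobReference(fileName);

await target.StartCopyAsync(source); // a blob "move" is copy + delete
await source.DeleteAsync();
```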


To keep the diagram in Figure 2 easy to parse, I've omitted some containers—in particular, the error files container is missing from the diagram. This is the container that holds any file that causes an exception during processing, whether that file is the entire usage file or just one of the daily splits. I don't spend time or effort correcting the data for missing or errored days because, once an issue is identified, the process can be triggered for a given month and enrollment or for a single daily split to correct the problem. Also clearly missing from the prototype are alerting and compensating mechanisms for when errors occur, but that's something I want to bubble up through Application Insights integration with the Operations Management Suite.

Persisting the Data to Azure Cosmos DB
With the files split and ready to be picked up by the ProcessDailyUsage Function, it's time to consider some issues that need to be addressed, namely throughput to the target and how to handle updates. Often when working through some solution architecture in an enterprise, you run into older systems that are less capable, or where real-time loads and high-throughput scenarios need to be managed. I don't naturally have any hard throughput constraints in my cloud-native setup for this architecture, but I could create problems for myself if I don't take the time to think through the volume and speed of the data I'm feeding into the cloud services I'm consuming.

For my data set, each of the daily splits is about 2.4MB and contains about 1,200 individual documents. Keep in mind that each document represents one meter reading for one resource provisioned in Azure. Thus, for each EA the number of documents in a daily split could vary greatly depending on resource usage across the enterprise. The ProcessDailyUsage Function is configured to trigger based on receiving new blobs in the newdailysplit container. This means I'll have as many as 31 concurrent Function executions manipulating the data. To help me estimate what I need to provision for Azure Cosmos DB, I used the calculator at documentdb.com/capacityplanner. Without some empirical testing I had to make a few guesses for the initial provisioning. I know there will be 31 concurrent executions, but it's a little harder to nail down how many concurrent requests per second that will create without doing repetitive runs. The end result of this prototype will help to inform the final architecture and requirements for provisioning, but because I'm working forward on this timeline, I'm going to take a stab at it using the following as my rules for estimating:

• 1,200 records
• 31 concurrent executions (for a single EA)
• 0.124 seconds per request (empirical evidence from measuring a few individual requests)

Figure 6 Azure Cosmos DB Pricing Calculator

I'll round down to 0.1 seconds for a more conservative estimate, thus overestimating the load. This nets 310 requests per second per EA, which in turn comes out to about 7,800 request units (RUs) based on the calculator results, as can be seen in Figure 6.

Because the maximum RUs that can be provisioned without calling support is 10,000, this might seem kind of high. However, I'm running an unthrottled parallel process and that drives up the throughput significantly, which in turn will drive up the cost. This is a major consideration when designing the structure, because it's fine for me to run this for some testing, but for the real solution I'll need a throttling mechanism to slow down the processing so I can provision fewer RUs and save myself a little money. I don't need the data to be captured as fast as possible, just within a reasonable enough time that someone could review and consume it on a daily basis.

The good news is that the Azure Functions team has a concurrency control mechanism in the backlog of issues that will eventually get resolved (bit.ly/2tcpAbI), and it will provide a good means of control once implemented. Some other options are to introduce artificial arbitrary delays (let's all agree this is bad) or to rework the processing and handle the parallel execution explicitly in the C# code. Also, as technical expert Fabio Cavalcante pointed out in a conversation, another good option would be to modify the architecture a bit by adding Azure Storage Queues and using features such as visibility timeouts and scheduled delivery to act as a throttling mechanism. That would add a few moving parts to the system and I'd have to work out the interaction of using a queue for activation while keeping the data in storage, or slice up the data into 64KB blocks for the queue. Once throttling is available in Azure Functions, I'll be able to keep it in this simpler form with which I'm working. The salient point here is that when working with a serverless architecture you must be familiar with the constraints of the platforms on which you're building, as well as the cost of each decision.

Figure 7 Provisioning a New Collection
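For clarity, the 310 requests-per-second figure follows directly from those estimating rules: 31 concurrent executions, each issuing roughly one request every 0.1 seconds. A quick sketch of the arithmetic:

```csharp
// Back-of-the-envelope load estimate from the rules above.
int concurrentExecutions = 31;   // one execution per daily split
double secondsPerRequest = 0.1;  // 0.124s measured, rounded down to overestimate load

double requestsPerSecond = concurrentExecutions / secondsPerRequest; // 310
// The Azure Cosmos DB capacity planner translates ~310 requests/sec
// for these documents into roughly 7,800 RUs.
```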

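If I did choose to handle the parallel execution explicitly in the C# code, one simple shape for that throttle is a semaphore gating the writes. This is only a sketch of the idea, not code from the prototype, and the concurrency limit is arbitrary:

```csharp
// Illustrative throttle: cap concurrent Azure Cosmos DB writes so
// fewer RUs need to be provisioned. The limit of 4 is arbitrary.
static readonly SemaphoreSlim WriteGate = new SemaphoreSlim(4);

static async Task WriteThrottledAsync(Func<Task> writeOperation)
{
    await WriteGate.WaitAsync(); // at most 4 writers at a time
    try
    {
        await writeOperation();
    }
    finally
    {
        WriteGate.Release();
    }
}
```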


When provisioning more than 2,500 RUs, the system requires that a partition key be specified. This works for me, because I want to partition that data in any case to help with both scale and security in the future. As you can see in Figure 7, I've specified 8,000 RUs, which is a little more than the calculation indicated, and I've specified SubscriptionId as the partition key.

Additionally, I set up the ProcessDailyUsage Function with a blob trigger on the newdailysplit container and with an input and output binding for Azure Cosmos DB. The input binding is used to find the records that exist for the given day and enrollment and to handle duplicates. I'll ensure that my FeedOptions sets the cross-partition query flag, as shown in Figure 8.

Figure 8 Using FeedOptions to Set the Cross-Partition Query Flag

string docsToDeleteQuery = String.Format(@"SELECT * FROM c where c.Enrollment =
  ""{0}"" AND c.Date = ""{1}""", enrollment, incomingDataDate);

FeedOptions queryOptions = new FeedOptions { MaxItemCount = -1,
  EnableCrossPartitionQuery = true };

IQueryable<Document> deleteQuery = docDBClient.CreateDocumentQuery<Document>(
  UriFactory.CreateDocumentCollectionUri(dbName, collectionName),
  new SqlQuerySpec(docsToDeleteQuery), queryOptions);

log.Info("Delete documents");

int deletedDocumentCount = 0;
foreach (Document doc in deleteQuery)
{
  await docDBClient.DeleteDocumentAsync(((dynamic)doc)._self,
    new RequestOptions { PartitionKey =
      new PartitionKey(((dynamic)doc).SubscriptionId) });
  deletedDocumentCount++;
}

I create a query to grab all the records for the enrollment on that date and then loop through and delete them. This is one instance where SQL Azure could've made things easier by issuing a DELETE query or by using an upsert with a known primary key. However, in Azure Cosmos DB, to do the upsert I need the row ID, which means I must make the round trip and do the comparison on fields I know to uniquely identify the document and then use that row's id or selflink. For this example, I simply delete all the records and then add the new—and potentially updated—documents back in. To do this I need to pass in the partition key to the DeleteDocumentAsync method. An optimization would be to pull the documents back and do a local comparison, update any changed documents and add net new documents. It's a little taxing, because all of the elements in each document must be compared. Because there's no primary key defined for the billing documents, you can likely find the matched document using SubscriptionId, MeterId, InstanceId and Date and compare the rest of the elements from there. This would offload some of the work from Azure Cosmos DB and reduce the overall traffic.

With the way cleared to add the documents back into the collection, I simply loop through the docs and call AddAsync on the documentCollector I defined as the output binding for the Azure Function:

// Update the enrollment field in the incoming collection
incomingDailyUsage.ForEach(usage => usage.Enrollment = enrollment);

int processedRecordCount = 0;
foreach (EnrollmentUsageDetail usageDoc in incomingDailyUsage)
{
  await documentCollector.AddAsync(usageDoc);
  processedRecordCount++;
}

While it's not much of a change, I've also done a little bit of enrichment by adding the Enrollment number to each document in the collection. Running one daily split file produces the log information shown in Figure 9.

Figure 9 The Log Information from a Daily Split File

2017-06-10T01:16:55.291 Function started (Id=bfb220aa-97ab-4d36-9c1e-602763b93ff0)
2017-06-10T01:16:56.041 First 15 chars: [{"AccountOwner
2017-06-10T01:16:56.181 get date
2017-06-10T01:16:56.181 getting enrollment
2017-06-10T01:16:56.181 Incoming date: 11/01/2016 for Enrollment: 4944727
2017-06-10T01:16:56.181 Collection: partitionedusage
2017-06-10T01:16:56.181 query: SELECT * FROM c where c.Enrollment =
  "4944727" AND c.Date = "11/01/2016"
2017-06-10T01:16:56.181 Create delete query
2017-06-10T01:16:56.197 Delete documents
2017-06-10T01:17:23.189 2142 docs deleted while processing
  20161101-4944727-dailysplit.json
2017-06-10T01:17:23.189 Import documents
2017-06-10T01:17:44.628 2142 records imported from file
  20161101-4944727-dailysplit.json
2017-06-10T01:17:44.628 Moving file 20161101-4944727-dailysplit.json to /
  processedusage container
2017-06-10T01:17:44.674 Deleting 20161101-4944727-dailysplit.json
2017-06-10T01:17:44.690 Completed!
2017-06-10T01:17:44.690 Function completed (Success, Id=bfb220aa-97ab-
  4d36-9c1e-602763b93ff0, Duration=49397ms)

Final Note
The only thing left to do is to run a good many iterations with varying inputs and then measure so I can properly size the services I'm using. This includes testing out the geographic replication capabilities and some further prototyping of the security that I'll want to implement around subscription data access; these were two of the major reasons for choosing Azure Cosmos DB. The net lessons to be gleaned are some of the ones that we seem to keep learning in the world of IT:

1. There are no magic bullets, not even with a serverless architecture.
2. Nothing replaces thorough testing.
3. Size your dependent services and treat this as seriously as you did when sizing your hardware in the past.
4. Pay close attention to cost, especially under high-throughput conditions.

The upside of using serverless compute like Azure Functions is that you pay only for what's consumed. For regular but infrequent processing such as this, that can be a big benefit in cost savings. Finally, configuring capabilities is a better experience and allows faster time to product than configuring host servers.

Joseph Fultz is a cloud solution architect at Microsoft. He works with Microsoft customers developing architectures for solving business problems leveraging Microsoft Azure. Formerly, Fultz was responsible for the development and architecture of GM's car-sharing program (mavendrive.com). Contact him on Twitter: @JosephRFultz or via e-mail at [email protected].

Thanks to the following Microsoft technical expert who reviewed this article: Fabio Cavalcante


Authorization in ASP.NET and ASP. Learning with R - Raj Krishnan That Word... - Philip Japikse
- Rockford Lhotka
NET Core - Benjamin Day

TH11 I’m Emotional - Using


TH10 Programming with the TH12 Top 10 Ways to Go from
TH09 From Zero to the Web API Microsoft Cognitive Services to
11:00 AM 12:15 PM Model-View-ViewModel Pattern Good to Great Scrum Master
- Paul Sheriff Understand the World Around You
- Miguel Castro - Benjamin Day
- Adam Tuliper

12:15 PM 1:15 PM Lunch

TH13 Cortana Everywhere: Speech, TH16 Exposing an Extensibility API


TH14 Roll Your Own Dashboard TH15 Unit Testing T-SQL Code
1:15 PM 2:30 PM Conversation & Skills Development for your Applications
in XAML - Billy Hollis - Steve Jones
- Nick Landry - Miguel Castro

TH18 Windows 10 for Developers:


TH17 Securing Web Apps and APIs TH19 A Tour of SQL Server Security TH20 Real World Applications for
2:45 PM 4:00 PM Building Universal Apps for 1B
with IdentityServer - Brian Noyes Features - Steve Jones Dependency Injection - Paul Sheriff
Devices - Nick Landry

Speakers and sessions subject to change

CONNECT WITH US vslive.com/anaheimmsdn


twitter.com/vslive – facebook.com – linkedin.com – Join the
@VSLive Search “VSLive” “Visual Studio Live” group!

Untitled-5 5 7/12/17 6:11 PM


Test Run JAMES MCCAFFREY

Deep Neural Network IO Using C#

Many of the recent advances in machine learning (making predictions using data) have been realized using deep neural networks. Examples include speech recognition in Microsoft Cortana and Apple Siri, and the image recognition that helps enable self-driving automobiles.

The term deep neural network (DNN) is general and there are several specific variations, including recurrent neural networks (RNNs) and convolutional neural networks (CNNs). The most basic form of a DNN, which I explain in this article, doesn't have a special name, so I'll refer to it just as a DNN.

[Figure 1 A Basic Deep Neural Network - diagram of a 2-(4-2-2)-3 network: two input nodes (1.0, 2.0) on the left; three hidden layers of four, two and two nodes; three output nodes (0.3269, 0.3333, 0.3398) on the right, with weight and bias values 0.01 through 0.37 labeling the connections]

This article will introduce you to DNNs so you'll have a concrete demo program to experiment with, which will help you
to understand literature on DNNs. I won't present code that can be used directly in a production system, but the code can be extended to create such a system, as I'll explain. Even if you never intend to implement a DNN, you might find the explanation of how they work interesting for its own sake.

A DNN is best explained visually. Take a look at Figure 1. The deep network has two input nodes, on the left, with values (1.0, 2.0). There are three output nodes on the right, with values (0.3269, 0.3333, 0.3398). You can think of a DNN as a complex math function that typically accepts two or more numeric input values and returns one or more numeric output values.

The DNN shown might correspond to a problem where the goal is to predict the political party affiliation (Democrat, Republican, Other) of a person based on age and income, where the input values are scaled in some way. If Democrat is encoded as (1,0,0) and Republican is encoded as (0,1,0) and Other is encoded as (0,0,1), then the DNN in Figure 1 predicts Other for someone with age = 1.0 and income = 2.0 because the last output value (0.3398) is the largest.

A regular neural network has a single hidden layer of processing nodes. A DNN has two or more hidden layers and can handle very difficult prediction problems. Specialized types of DNNs, such as RNNs and CNNs, also have multiple layers of processing nodes, but more complicated connection architectures, as well.

The DNN in Figure 1 has three hidden layers of processing nodes. The first hidden layer has four nodes, the second and third hidden layers have two nodes. Each long arrow pointing from left to right represents a numeric constant called a weight. If nodes are zero-base indexed with [0] at the top of the figure, then the weight connecting input[0] to hidden[0][0] (layer 0, node 0) has value 0.01 and the weight connecting input[1] to hidden[0][3] (layer 0, node 3) has value 0.08, and so on. There are 26 node-node weight values.

Each of the eight hidden and three output nodes has a small arrow that represents a numeric constant called a bias. For example, hidden[2][0] has a bias value of 0.33 and output[1] has a bias value of 0.36. Not all of the weights and bias values are labeled in the diagram, but because the values are sequential between 0.01 and 0.37, you can easily determine the value of a non-labeled weight or bias.

In the sections that follow, I explain how the DNN input-output mechanism works and show how to implement it. The demo program is coded using C#, but you shouldn't have too much trouble refactoring the code to another language, such as Python or JavaScript, if you wish to do so. The demo program is too long to present in its entirety in this article, but the complete program is available in the accompanying code download.
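The decoding rule just described, picking the category whose output value is largest, is a simple argmax. A minimal sketch in Python (used here only as an illustration; the article's demo itself is C#), with the labels and output values from the Figure 1 example:

```python
# Output values and category encodings from the Figure 1 example
outputs = [0.3269, 0.3333, 0.3398]
labels = ["Democrat", "Republican", "Other"]  # (1,0,0), (0,1,0), (0,0,1)

# argmax: the index of the largest output value selects the category
predicted = labels[outputs.index(max(outputs))]
print(predicted)  # Other
```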
The Demo Program
Code download available at msdn.com/magazine/0717magcode.
A good way to see where this article is headed is to examine the screenshot of the demo program in Figure 2. The demo corresponds to the
58 msdn magazine

0817msdn_McCaffreyTRun_v4_58-64.indd 58 7/12/17 11:53 AM




DNN shown in Figure 1 and illustrates the input-output mechanism by displaying the values of the 13 nodes in the network. The demo code that generated the output begins with the code shown in Figure 3.

[Figure 2 Basic Deep Neural Network Demo Run - screenshot of the demo console output]

Figure 3 Beginning of Output-Generating Code

using System;
namespace DeepNetInputOutput
{
  class DeepInputOutputProgram
  {
    static void Main(string[] args)
    {
      Console.WriteLine("Begin deep net IO demo");
      Console.WriteLine("Creating a 2-(4-2-2)-3 deep network");
      int numInput = 2;
      int[] numHidden = new int[] { 4, 2, 2 };
      int numOutput = 3;
      DeepNet dn = new DeepNet(numInput, numHidden, numOutput);
...

Notice that the demo program uses only plain C# with no namespaces except for System. The DNN is created by passing the number of nodes in each layer to a DeepNet program-defined class constructor. The number of hidden layers, 3, is passed implicitly as the number of items in the numHidden array. An alternative design is to pass the number of hidden layers explicitly.

The values of the 26 weights and the 11 biases are set like so:

int nw = DeepNet.NumWeights(numInput, numHidden, numOutput);
Console.WriteLine("Setting weights and biases to 0.01 to " +
  (nw/100.0).ToString("F2"));
double[] wts = new double[nw];
for (int i = 0; i < wts.Length; ++i)
  wts[i] = (i + 1) * 0.01;
dn.SetWeights(wts);

The total number of weights and biases is calculated using a static class method NumWeights. If you refer back to Figure 1, you can see that because each node is connected to all nodes in the layer to the right, the number of weights is (2*4) + (4*2) + (2*2) + (2*3) = 8 + 8 + 4 + 6 = 26. Because there's one bias for each hidden and output node, the total number of biases is 4 + 2 + 2 + 3 = 11.

An array named wts is instantiated with 37 cells and then the values are set to 0.01 through 0.37. These values are inserted into the DeepNet object using the SetWeights method. In a realistic, non-demo DNN, the values of the weights and biases would be determined using a set of data that has known input values and known, correct output values. This is called training the network. The most common training algorithm is called back-propagation.

The Main method of the demo program concludes with:

...
Console.WriteLine("Computing output for [1.0, 2.0] ");
double[] xValues = new double[] { 1.0, 2.0 };
dn.ComputeOutputs(xValues);
dn.Dump(false);
Console.WriteLine("End demo");
Console.ReadLine();
} // Main
} // Class Program

Method ComputeOutputs accepts an array of input values and then uses the input-output mechanism, which I'll explain shortly, to calculate and store the values of the output nodes. The Dump helper method displays the values of the 13 nodes, and the "false" argument means to not display the values of the 37 weights and biases.

The Input-Output Mechanism
The input-output mechanism for a DNN is best explained with a concrete example. The first step is to use the values in the input nodes to calculate the values of the nodes in the first hidden layer. The value of the top-most hidden node in the first hidden layer is:

tanh( (1.0)(0.01) + (2.0)(0.05) + 0.27 ) = tanh(0.38) = 0.3627

In words, "compute the sum of the products of each input node and its associated weight, add the bias value, then take the hyperbolic tangent of the sum." The hyperbolic tangent, abbreviated tanh, is called the activation function. The tanh function accepts any value from negative infinity to positive infinity, and returns a value between -1.0 and +1.0. Important alternative activation functions include the logistic sigmoid and rectified linear (ReLU) functions, which are outside the scope of this article.

The values of the nodes in the remaining hidden layers are calculated in exactly the same way. For example, hidden[1][0] is:

tanh( (0.3627)(0.09) + (0.3969)(0.11) + (0.4301)(0.13) + (0.4621)(0.15) + 0.31 ) = tanh(0.5115) = 0.4711

And hidden[2][0] is:

tanh( (0.4711)(0.17) + (0.4915)(0.19) + 0.33 ) = tanh(0.5035) = 0.4649

The values of the output nodes are calculated using a different activation function, called softmax. The preliminary, pre-activation sum-of-products plus bias step is the same:

pre-activation output[0] = (0.4649)(0.21) + (0.4801)(0.24) + 0.35 = 0.5628
pre-activation output[1] = (0.4649)(0.22) + (0.4801)(0.25) + 0.36 = 0.5823
pre-activation output[2] = (0.4649)(0.23) + (0.4801)(0.26) + 0.37 = 0.6017
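The layer-by-layer arithmetic above can be cross-checked with a short Python sketch (Python serves only as a quick calculator here; the weight and bias values follow the sequential 0.01-0.37 scheme from Figure 1):

```python
import math

x = [1.0, 2.0]  # input node values

# input-to-first-hidden weights, one row per input node
ih = [[0.01, 0.02, 0.03, 0.04], [0.05, 0.06, 0.07, 0.08]]
h0 = [math.tanh(x[0]*ih[0][j] + x[1]*ih[1][j] + b)
      for j, b in enumerate([0.27, 0.28, 0.29, 0.30])]
print([round(v, 4) for v in h0])   # [0.3627, 0.3969, 0.4301, 0.4621]

# hidden layer 0 to hidden layer 1
hh01 = [[0.09, 0.10], [0.11, 0.12], [0.13, 0.14], [0.15, 0.16]]
h1 = [math.tanh(sum(h0[i]*hh01[i][j] for i in range(4)) + b)
      for j, b in enumerate([0.31, 0.32])]
print([round(v, 4) for v in h1])   # [0.4711, 0.4915]

# hidden layer 1 to hidden layer 2
hh12 = [[0.17, 0.18], [0.19, 0.20]]
h2 = [math.tanh(sum(h1[i]*hh12[i][j] for i in range(2)) + b)
      for j, b in enumerate([0.33, 0.34])]
print([round(v, 4) for v in h2])   # [0.4649, 0.4801]

# pre-activation output sums (softmax is applied afterward)
ho = [[0.21, 0.22, 0.23], [0.24, 0.25, 0.26]]
pre = [sum(h2[i]*ho[i][k] for i in range(2)) + b
       for k, b in enumerate([0.35, 0.36, 0.37])]
print([round(v, 4) for v in pre])  # [0.5628, 0.5823, 0.6017]
```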


The softmax of three arbitrary values x, y, z is:

softmax(x) = e^x / (e^x + e^y + e^z)
softmax(y) = e^y / (e^x + e^y + e^z)
softmax(z) = e^z / (e^x + e^y + e^z)

where e is Euler's number, approximately 2.718282. So, for the DNN in Figure 1, the final output values are:

output[0] = e^0.5628 / (e^0.5628 + e^0.5823 + e^0.6017) = 0.3269
output[1] = e^0.5823 / (e^0.5628 + e^0.5823 + e^0.6017) = 0.3333
output[2] = e^0.6017 / (e^0.5628 + e^0.5823 + e^0.6017) = 0.3398

The purpose of the softmax activation function is to coerce the output values to sum to 1.0 so that they can be interpreted as probabilities and map to a categorical value. In this example, because the third output value is the largest, whatever categorical value was encoded as (0,0,1) would be the predicted category for inputs = (1.0, 2.0).

Implementing a DeepNet Class
To create the demo program, I launched Visual Studio and selected the C# Console Application template and named it DeepNetInputOutput. I used Visual Studio 2015, but the demo has no significant .NET dependencies, so any version of Visual Studio will work.

After the template code loaded, in the Solution Explorer window, I right-clicked on file Program.cs and renamed it to the more descriptive DeepNetInputOutputProgram.cs and allowed Visual Studio to automatically rename class Program for me. At the top of the editor window, I deleted all unnecessary using statements, leaving just the one that references the System namespace.

I implemented the demo DNN as a class named DeepNet. The class definition begins with:

public class DeepNet
{
  public static Random rnd;
  public int nInput;
  public int[] nHidden;
  public int nOutput;
  public int nLayers;
...

All class members are declared with public scope for simplicity. The static Random object member named rnd is used by the DeepNet class to initialize weights and biases to small random values (which are then overwritten with values 0.01 to 0.37). Members nInput and nOutput are the number of input and output nodes. Array member nHidden holds the number of nodes in each hidden layer, so the number of hidden layers is given by the Length property of the array, which is stored into member nLayers for convenience. The class definition continues:

public double[] iNodes;
public double[][] hNodes;
public double[] oNodes;

A deep neural network implementation has many design choices. Array members iNodes and oNodes hold the input and output values, as you'd expect. Array-of-arrays member hNodes holds the hidden node values. An alternative design is to store all nodes in a single array-of-arrays structure nnNodes, where in the demo nnNodes[0] is an array of input node values and nnNodes[4] is an array of output node values.

The node-to-node weights are stored using these data structures:

public double[][] ihWeights;
public double[][][] hhWeights;
public double[][] hoWeights;

Member ihWeights is an array-of-arrays-style matrix that holds the input-to-first-hidden-layer weights. Member hoWeights is an array-of-arrays-style matrix that holds the weights connecting the last hidden layer nodes to the output nodes. Member hhWeights is an array where each cell points to an array-of-arrays matrix that holds the hidden-to-hidden weights. For example, hhWeights[0][3][1] holds the weight connecting hidden node [3] in hidden layer [0] to hidden node [1] in hidden layer [0+1]. These data structures are the heart of the DNN input-output mechanism and are a bit tricky. A conceptual diagram of them is shown in Figure 4.

The last two class members hold the hidden node biases and the output node biases:

public double[][] hBiases;
public double[] oBiases;

As much as any software system I work with, DNNs have many alternative data structure designs, and having a sketch of these data structures is essential when writing input-output code.

Computing the Number of Weights and Biases
To set the weights and biases values, it's necessary to know how many weights and biases there are. The demo program implements the static method NumWeights to calculate and return this number. Recall that the 2-(4-2-2)-3 demo network has (2*4) + (4*2) + (2*2) + (2*3) = 26 weights and 4 + 2 + 2 + 3 = 11 biases. The key code in method NumWeights, which calculates the number of input-to-hidden, hidden-to-hidden and hidden-to-output weights, is:

int ihWts = numInput * numHidden[0];
int hhWts = 0;
for (int j = 0; j < numHidden.Length - 1; ++j) {
  int rows = numHidden[j];
  int cols = numHidden[j + 1];
  hhWts += rows * cols;
}
int hoWts = numHidden[numHidden.Length - 1] * numOutput;

Instead of returning the total number of weights and biases as method NumWeights does, you might want to consider returning the number of weights and biases separately, in a two-cell integer array.

Setting Weights and Biases
A non-demo DNN typically initializes all weights and biases to small random values. The demo program sets the 26 weights to 0.01 through 0.26, and the biases to 0.27 through 0.37, using class method SetWeights. The definition begins with:

public void SetWeights(double[] wts)
{
  int nw = NumWeights(this.nInput, this.nHidden, this.nOutput);
  if (wts.Length != nw)
    throw new Exception("Bad wts[] length in SetWeights()");
  int ptr = 0;
...

[Figure 4 Weights and Biases Data Structures - conceptual diagram of the ihWeights[][], hhWeights[][][], and hoWeights[][] structures]




Input parameter wts holds the values for the weights and biases, and is assumed to have the correct Length. Variable ptr points into the wts array. The demo program has very little error checking in order to keep the main ideas as clear as possible. The input-to-first-hidden-layer weights are set like so:

for (int i = 0; i < nInput; ++i)
  for (int j = 0; j < hNodes[0].Length; ++j)
    ihWeights[i][j] = wts[ptr++];

Next, the hidden-to-hidden weights are set:

for (int h = 0; h < nLayers - 1; ++h)
  for (int j = 0; j < nHidden[h]; ++j) // From
    for (int jj = 0; jj < nHidden[h+1]; ++jj) // To
      hhWeights[h][j][jj] = wts[ptr++];

If you're not accustomed to working with multi-dimensional arrays, the indexing can be quite tricky. A diagram of the weights and biases data structures is essential (well, for me, anyway). The last-hidden-layer-to-output weights are set like this:

int hi = this.nLayers - 1;
for (int j = 0; j < this.nHidden[hi]; ++j)
  for (int k = 0; k < this.nOutput; ++k)
    hoWeights[j][k] = wts[ptr++];

This code uses the fact that if there are nLayers hidden layers (3 in the demo), then the index of the last hidden layer is nLayers-1. Method SetWeights concludes by setting the hidden node biases and the output node biases:

...
for (int h = 0; h < nLayers; ++h)
  for (int j = 0; j < this.nHidden[h]; ++j)
    hBiases[h][j] = wts[ptr++];
for (int k = 0; k < nOutput; ++k)
  oBiases[k] = wts[ptr++];
}

Computing the Output Values
The definition of class method ComputeOutputs begins with:

public double[] ComputeOutputs(double[] xValues)
{
  for (int i = 0; i < nInput; ++i)
    iNodes[i] = xValues[i];
...

The input values are in array parameter xValues. Class member nInput holds the number of input nodes and is set in the class constructor. The first nInput values in xValues are copied into the input nodes, so xValues is assumed to have at least nInput values in the first cells. Next, the current values in the hidden and output nodes are zeroed-out:

for (int h = 0; h < nLayers; ++h)
  for (int j = 0; j < nHidden[h]; ++j)
    hNodes[h][j] = 0.0;
for (int k = 0; k < nOutput; ++k)
  oNodes[k] = 0.0;

The idea here is that the sum-of-products term will be accumulated directly into the hidden and output nodes, so these nodes must be explicitly reset to 0.0 for each method call. An alternative is to declare and use local arrays with names like hSums[][] and oSums[]. Next, the values of the nodes in the first hidden layer are calculated:

for (int j = 0; j < nHidden[0]; ++j) {
  for (int i = 0; i < nInput; ++i)
    hNodes[0][j] += ihWeights[i][j] * iNodes[i];
  hNodes[0][j] += hBiases[0][j]; // Add the bias
  hNodes[0][j] = Math.Tanh(hNodes[0][j]); // Activation
}

The code is pretty much a one-to-one mapping of the mechanism described earlier. The built-in Math.Tanh is used for hidden node activation. As I mentioned, important alternatives are the logistic sigmoid function and the rectified linear unit (ReLU) functions, which I'll explain in a future article. Next, the remaining hidden-layer nodes are calculated:

for (int h = 1; h < nLayers; ++h) {
  for (int j = 0; j < nHidden[h]; ++j) {
    for (int jj = 0; jj < nHidden[h-1]; ++jj)
      hNodes[h][j] += hhWeights[h-1][jj][j] * hNodes[h-1][jj];
    hNodes[h][j] += hBiases[h][j];
    hNodes[h][j] = Math.Tanh(hNodes[h][j]);
  }
}

This is the trickiest part of the demo program, mostly due to the multiple array indexes required. Next, the pre-activation sums of products are calculated for the output nodes:

for (int k = 0; k < nOutput; ++k) {
  for (int j = 0; j < nHidden[nLayers - 1]; ++j)
    oNodes[k] += hoWeights[j][k] * hNodes[nLayers - 1][j];
  oNodes[k] += oBiases[k]; // Add bias
}

Method ComputeOutputs concludes by applying the softmax activation function, returning the computed output values in a separate array:

...
double[] retResult = Softmax(oNodes);
for (int k = 0; k < nOutput; ++k)
  oNodes[k] = retResult[k];
return retResult;
}

The Softmax method is a static helper. See the accompanying code download for details. Notice that because softmax activation requires all the values that will be activated (in the denominator term), it's more efficient to compute all softmax values at once instead of separately. The final output values are stored into the output nodes and are also returned separately for calling convenience.

Wrapping Up
There has been enormous research activity and many breakthroughs related to deep neural networks over the past few years. Specialized DNNs such as convolutional neural networks, recurrent neural networks, LSTM neural networks and residual neural networks are very powerful but very complex. In my opinion, understanding how basic DNNs operate is essential for understanding the more complex variations.

In a future article, I'll explain in detail how to use the back-propagation algorithm (arguably the most famous and important algorithm in machine learning) to train a basic DNN. Back-propagation, or at least some form of it, is used to train most DNN variations, too. This explanation will introduce the concept of the vanishing gradient, which in turn will explain the design and motivation of many of the DNNs now being used for very sophisticated prediction systems.

Dr. James McCaffrey works for Microsoft Research in Redmond, Wash. He has worked on several Microsoft products, including Internet Explorer and Bing. Dr. McCaffrey can be reached at [email protected].

Thanks to the following Microsoft technical experts who reviewed this article: Li Deng, Pingjun Hu, Po-Sen Huang, Kirk Li, Alan Liu, Ricky Loynd, Baochen Sun, Henrik Turbell.
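As a final cross-check of the article's worked example, the entire input-output mechanism can be sketched compactly in Python (an illustrative re-implementation that mirrors the ComputeOutputs logic, not the demo's C# code; the weight layout follows the sequential 0.01 to 0.37 scheme used by the demo):

```python
import math

def softmax(vals):
    # divide each exponentiated value by the sum of all of them
    exps = [math.exp(v) for v in vals]
    total = sum(exps)
    return [e / total for e in exps]

def compute_outputs(x, ih, hh, ho, h_biases, o_biases):
    # accumulate sum-of-products, add bias, then activate, layer by layer
    n_layers = len(h_biases)
    h = [[0.0] * len(b) for b in h_biases]
    for j in range(len(h[0])):  # first hidden layer: tanh(x . w + b)
        h[0][j] = math.tanh(sum(x[i] * ih[i][j] for i in range(len(x)))
                            + h_biases[0][j])
    for layer in range(1, n_layers):  # remaining hidden layers
        for j in range(len(h[layer])):
            s = sum(hh[layer - 1][jj][j] * h[layer - 1][jj]
                    for jj in range(len(h[layer - 1])))
            h[layer][j] = math.tanh(s + h_biases[layer][j])
    pre = [sum(ho[j][k] * h[-1][j] for j in range(len(h[-1]))) + o_biases[k]
           for k in range(len(o_biases))]
    return softmax(pre)  # output layer uses softmax activation

# 2-(4-2-2)-3 network with weights and biases 0.01 .. 0.37, as in the demo
ih = [[0.01, 0.02, 0.03, 0.04], [0.05, 0.06, 0.07, 0.08]]
hh = [[[0.09, 0.10], [0.11, 0.12], [0.13, 0.14], [0.15, 0.16]],
      [[0.17, 0.18], [0.19, 0.20]]]
ho = [[0.21, 0.22, 0.23], [0.24, 0.25, 0.26]]
h_biases = [[0.27, 0.28, 0.29, 0.30], [0.31, 0.32], [0.33, 0.34]]
o_biases = [0.35, 0.36, 0.37]

out = compute_outputs([1.0, 2.0], ih, hh, ho, h_biases, o_biases)
print([round(v, 4) for v in out])  # [0.3269, 0.3333, 0.3398]
```

The printed values match the output nodes in Figure 1, and the three softmax results sum to 1.0 as expected.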





The Working Programmer TED NEWARD

How To Be MEAN: Up-Angular-izing

Welcome back again, MEANers.

It's been two years since I started this particular series on the MEAN (Mongo, Express, Angular, Node) stack. And, as was bound to happen, various parts of the MEAN stack have evolved since the series started. Most of these updates (specifically the Node, Express and Mongo versions) are transparent, and adopting them is a non-event: Just upgrade the underlying bits and everything works.

But Angular upgrades have been of some concern to the Web front-end world for a while, particularly because the substantive changes between AngularJS (v1) and Angular (v2 and beyond) created some serious backward-compatibility issues. (I use the term "backward compatibility" loosely here, because the backward-compatibility story for v1 to v2 was essentially, "Rewrite the whole thing—trust us, it'll be great!") Thus, it was with some amount of consternation that the Angular world was watching for the first major update to Angular, and when that update was announced to be a major version enhancement, anxiety mounted.

Turns out, while we were busy writing the front end of the sample application, the Angular team did what they were supposed to do—release a new version of Angular into the world. That means it's time to take a moment, bite the bullet, and upgrade the application to the new version of Angular: v4. (The Angular team decided to skip 3 and move straight to 4.) Spoiler alert: This turns out to be far, far less painful than people imagined it might be, and offers a lot of hope regarding future Angular updates—which is good because the Angular team has promised that they're moving to a release cadence much more in line with traditional open source projects. Which means, bluntly, a lot of small, incremental upgrades released much more quickly (every 6 months) than what's been the norm so far.

Upgrading Angular
Fundamentally, upgrading to Angular 4 means using the Node Package Manager (npm) to update the npm packages in use to the latest versions. This takes the form of the too-familiar "npm install" command, using a version tag ("@latest") for each package and the "--save" argument to capture the latest version into the application's package.json file. For those running on a *nix system (Linux or macOS, typically), the command takes the following form, all of which should be typed on one line:

npm install @angular/{common,compiler,compiler-cli,core,forms,http,platform-browser,platform-browser-dynamic,platform-server,router,animations}@latest typescript@latest --save

The *nix command shells allow for the various packages to be captured under the "{"/"}" pairs, even though technically each one is named "@angular/common," "@angular/compiler" and so on. For those of you on Windows, you get this slightly longer version:

npm install @angular/common@latest @angular/compiler@latest @angular/compiler-cli@latest @angular/core@latest @angular/forms@latest @angular/http@latest @angular/platform-browser@latest @angular/platform-browser-dynamic@latest @angular/platform-server@latest @angular/router@latest @angular/animations@latest typescript@latest --save

Once the "npm install" is finished executing, for all intents and purposes the upgrade is done. Simply run the application using "ng serve" again, and everything should be back to running status.

Angular 2-to-4 Pain Points
The Angular team has admitted that it's not always a smooth transition—however, the release notes take care to point out that most of the pain (apparently) is localized to the use of animations, which is a subject I haven't explored yet. Specifically, the team removed animations from @angular/core, and dropped them into their own Node package, @angular/animations (which you can see in the previous "npm install" command). That way, if the application doesn't use animations, it doesn't have to carry along the code of animations in the core package.

Angular 4 New Features
The Angular 4 release notes carry the full weight of the story, but there are a few things in particular worth calling out.

First, the Angular team is focused on reducing the size/weight footprint of the Angular libraries. This is good for obvious reasons, particularly for those users who aren't on high-speed fiber connections with the rest of the world. The Angular team says they're not done, either, so expect that each successive Angular release will seek to decrease its footprint even further. In the same spirit, the Angular team has reduced the overall size of the generated codebehind view templates, up to 60 percent.


Again, this means that the application you build will be that much smaller and lighter.

Second, the team improved the "*ngIf" and "*ngFor" directives used in view templates for branching and iteration scenarios, respectively. You haven't seen those yet, so the new features won't be apparent yet, but you'll see them soon, so hang in there.

Last, the Angular team also brought the Angular libraries up to the latest version of TypeScript (2.2), which includes better nullable checking, some better type support for ECMAScript (ES) 2015-style mixins, and an "object" type to represent a type that's the base type of all declared types in TypeScript, similar to the role that System.Object serves in much .NET code. This implicitly also brings support for TypeScript 2.1, which has some interesting features of its own, like the "keyof" operator, mapped types (which provide the utility types Partial, Readonly, Pick and Record), and support for the "spread" and "rest" operators from ES 2015. All of this is well beyond the scope of Angular itself, but any good TypeScript tutorial (or the TypeScript Web site itself) will explain their use. Fundamentally, these won't change the code that you write when writing Angular, at least not right away, but as these features get used more in the Angular library, they might start finding their way into the surface area of the Angular API. That likely won't happen for a while, however, so for the moment, the biggest thing to keep in mind is that Angular is keeping up with the evolution of TypeScript.

Wrapping Up
Hopefully I've helped you understand that this upgrade costs you almost nothing—that's the best kind of version update. More important, it's refreshing to know that as Angular applications grow and evolve, the work required to keep them up-to-date with the latest versions of Angular is (for the moment, anyway) trivial.

How To Be MEAN: Two Years On
While working on this column, MSDN Magazine Editor in Chief Michael Desmond pointed out that my How To Be MEAN series was turning 2 years old as of this issue. How is it that I'm still working in the MEAN mines? Some of it has to do with the fact that this series is attacking a rather large subject—a complete soup-to-nuts, front-end-to-data-storage, REST API middleware-based platform, rather than just a library or framework. But some of it has to do with the nature of the MEAN stack itself.

You see, the MEAN platform is different from the .NET Framework platform not in terms of what it provides—both have a programming language, an HTTP library/framework for receiving JSON data submitted, drivers for accessing databases and so on. Rather, it differs in terms of what it doesn't provide. That is to say, the MEAN platform, building on top of the Node.js platform, stresses a sense of "minimalism" that the .NET platform doesn't. That might sound like a slight to one or the other platform—that somehow Node.js isn't "fully baked" or that .NET is "too heavy." No such value judgment is intended. But where .NET emerged from Microsoft and continues to be heavily driven by what the .NET Framework team has built over the years, the Node.js platform has been bolted together from libraries built by hundreds of teams and thousands of developers from all across the world. There are pros and cons to each approach—but that's not the direction I'm headed with this.

The fact is both platforms are available to you, at your discretion. And even just two years ago, the idea of Microsoft being a platform by which developers could use either .NET or Node.js—or even Java or PHP—for building applications on or near the Microsoft OS (or cloud platform) seemed ludicrous. There were signs that suggested that Microsoft might reach this kind of "all platforms created equal" mentality, but the company's history suggested we might see an approach where .NET would be first among those equals.

Consider this for a moment: The "A" in the MEAN stack stands for Angular. When I began this series, Angular was not the powerhouse, rich-client, single-page application (SPA) platform that it is today—it was but one of several potential bets that you might make on the JavaScript front-end landscape. Angular has seen a definite rise in interest, and the pages of this magazine have been decorated with numerous references to Angular, both within the confines of this column and in feature pieces written by others. What's remarkable is that this interest is in a front-end technology written in the open source world by a team that not only doesn't work for Microsoft, but works for one of Microsoft's competitors. Yet it uses the open source TypeScript language developed by Microsoft. It's enough to make your head spin.

The MEAN stack, and the coverage of MEAN in this magazine, in many ways articulate everything about "the new Microsoft." It's a stellar demonstration of how the Microsoft of 2017 is so entirely different from the Microsoft of 2007 or 2000. The Microsoft that valued competition over cooperation and community is long gone. The company before us today certainly competes, but not with its community. The Microsoft of 2017 wants you to use the technology stack of your choice, ideally within its cloud or on its OS, but if you have a different choice than that, well, that's your choice.

At the end of the day, the MEAN stack is "just" a stack made up of three parts (MongoDB, Angular and Node.js/Express) that can interoperate with one another. And the fact that Microsoft not only embraces that, but encourages it, tells you just how far things have come from where it was before.

Kind of makes you wonder what the next few years have in store for us, doesn't it? Happy coding!

Ted Neward is a Seattle-based polytechnology consultant, speaker and mentor, currently working as the director of developer relations at Smartsheet.com. He has written more than 100 articles, authored and coauthored a dozen books, and works all over the world. Reach him at [email protected] or read his blog at blogs.tedneward.com.
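To make the TypeScript 2.1-era features this column mentions concrete—keyof, the mapped-type-based utility types (Partial, Readonly, Pick, Record) and the ES 2015 spread/rest operators—here's a minimal sketch. The Country type and its values are invented for illustration; they aren't part of Angular's API:

```typescript
// Illustrative type; not part of Angular or the TypeScript library itself.
interface Country {
  name: string;
  capital: string;
  gdpPerCapita: number;
}

// keyof yields a union of the property names: "name" | "capital" | "gdpPerCapita".
type CountryKey = keyof Country;
const key: CountryKey = "capital";

// Partial<T> and Pick<T, K> are built on mapped types.
const patch: Partial<Country> = { capital: "Lilongwe" };  // every property optional
const label: Pick<Country, "name"> = { name: "Malawi" };  // only the listed property

// Object spread merges objects; array rest collects the remainder.
const base = { name: "Malawi", gdpPerCapita: 226.5 };
const full: Country = { ...base, capital: "Lilongwe" };
const [first, ...rest] = ["alpha", "beta", "gamma"];

console.log(key, patch.capital, label.name, full.capital, first, rest.length);
```

None of this is specific to Angular; it's simply what the compiler upgrade makes available to your own application code.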



Essential .NET MARK MICHAELIS

C# 7.0: Tuples Explained

Code download available at itl.tc/MSDN.2017.08.

Back in November, in the Connect(); special issue, I provided an overview of C# 7.0 (msdn.microsoft.com/magazine/mt790178), in which I introduced tuples. In this article, I delve into tuples again, covering the full breadth of the syntax options.

To begin, let's consider the question: Why tuples? On occasion, you'll likely find it useful to combine data elements. Suppose, for example, you're working with information about countries, such as the poorest country in the world in 2017: Malawi, whose capital is Lilongwe, with a gross domestic product (GDP) per capita of $226.50. You could obviously declare a class for this data, but it doesn't really represent your typical noun/object. It's seemingly more a collection of related pieces of data than it is an object. Surely, if you were going to have a Country object, for example, it would have considerably more data than just properties for the Name, Capital and GDP per capita. Alternatively, you could store each data element in individual variables, but the result would be no association between the data elements; $226.50 would have no association with Malawi except perhaps by a common suffix or prefix in the variable names. Another option would be to combine all the data into a single string—with the disadvantage that working with each data element individually would require parsing it out. A final approach might be to create an anonymous type, but that, too, has limitations; enough, in fact, that tuples could potentially replace anonymous types entirely. I'll leave this topic until the end of the article.

The best option might be the C# 7.0 tuple, which, at its simplest, provides a syntax that allows you to combine the assignment of multiple variables, of varying types, in a single statement:

  (string country, string capital, double gdpPerCapita) =
    ("Malawi", "Lilongwe", 226.50);

In this case, I'm not only assigning multiple variables, but declaring them as well.

However, tuples have several additional syntax possibilities, each shown in Figure 1.

In the first four examples, although the right-hand side represents a tuple, the left-hand side still represents individual variables that are assigned together using tuple syntax, which involves two or more elements separated by commas and associated with parentheses. (I use the term tuple syntax because the underlying data type the compiler generates on the left-hand side isn't technically a tuple.) The result is that although I start with values combined as a tuple on the right, the assignment to the left deconstructs the tuple into its constituent parts. In example 2, the left-hand-side assignment is to pre-declared variables. However, in examples 1, 3 and 4, the variables are declared within the tuple syntax. Given that I'm only declaring variables, the naming and casing convention follows the generally accepted Framework Design Guidelines—"Do use camelCase for local variable names," for example.

Note that although implicit typing (var) can be distributed across each variable declaration within the tuple syntax, as shown in example 4, you can't do the same with an explicit type (such as string). In this case, you're actually declaring a tuple type, not just using tuple syntax and, therefore, you'll need to add a reference to the System.ValueTuple NuGet package—at least until .NET Standard 2.0. Because tuples allow each item to be a different data type, distributing the explicit type name across all elements wouldn't necessarily work unless all the item data types were identical (and even then, the compiler doesn't allow it).

In example 5, I declare a tuple on the left-hand side and then assign the tuple on the right. Note that the tuple has named items—names you can then reference to retrieve the item values back out of the tuple. This is what enables the countryInfo.Name, countryInfo.Capital, and countryInfo.GdpPerCapita syntax in the System.Console.WriteLine statement. The result of the tuple declaration on the left is a grouping of the variables into a single variable (countryInfo) from which you can then access the constituent parts. This is useful because you can then pass this single variable around to other methods and those methods will also be able to access the individual items within the tuple.

As already mentioned, variables defined using tuple syntax use camelCase. However, the convention for tuple item names isn't well-defined. Suggestions include using parameter-naming conventions when the tuple behaves like a parameter—such as when returning multiple values that before tuple syntax would've used out parameters. The alternative is to use PascalCase, following the naming convention for public fields and properties. I strongly favor the latter approach in accordance with the Capitalization Rules for Identifiers (itl.tc/caprfi). Tuple item names are rendered as members of the tuple and the convention for all (public) members (which are potentially accessed using a dot operator) is PascalCase.

Example 6 provides the same functionality as example 5, although it uses named tuple items on the right-hand-side tuple value and an implicit type declaration on the left. The items' names are persisted

to the implicitly typed variable, however, so they're still available for the WriteLine statement. Of course, this opens the possibility that you could name the items on the left-hand side with names that are different from those you use on the right. While the C# compiler allows this, it will issue a warning that the item names on the right will be ignored as those on the left take precedence.

If no item names are specified, the individual elements are still available from the assigned tuple variable. However, the names are Item1, Item2 and so on, as shown in example 7. In fact, the ItemX name is always available on the tuple—even when custom names are provided (see example 8). However, when using IDE tools like any of the recent flavors of Visual Studio that support C# 7.0, the ItemX property will not appear within the IntelliSense dropdown—a good thing because presumably the provided name is preferable. As shown in example 9, portions of a tuple assignment can be excluded using an underscore; this is called a discard.

Tuples are a lightweight solution for encapsulating data into a single object in the same way that a bag might capture miscellaneous items you pick up from the store. Unlike arrays, tuples contain item data types that can vary virtually without constraint (although pointers aren't allowed), except that they're identified by the code and can't be changed at run time. Also, unlike with arrays, the number of items within the tuple is hardcoded at compile time, as well. Last, you can't add custom behavior to a tuple (extension methods notwithstanding). If you need behavior associated with the encapsulated data, then leveraging object-oriented programming and defining a class is the preferred approach.

The System.ValueTuple<…> Type
The C# compiler generates code that relies on a set of generic value types (structs), such as System.ValueTuple<T1, T2, T3>, as the underlying implementation for the tuple syntax for all tuple instances on the right-hand side of the examples in Figure 1. Similarly, the same set of System.ValueTuple<...> generic value types is used for the left-hand-side data type starting with example 5. As you'd expect with a tuple type, the only methods included are those related to comparison and equality. However, perhaps unexpectedly, there are no properties for ItemX, but rather read-write fields (seemingly breaking the most basic of .NET Programming Guidelines as explained at itl.tc/CS7TuplesBreaksGuidelines).

In addition to the programming guidelines discrepancy, there's another behavioral question that arises. Given that the custom item names and their types aren't included in the System.ValueTuple<...> definition, how is it possible that each custom item name is seemingly a member of the System.ValueTuple<...> type and accessible as a member of that type?

What's surprising (particularly for those familiar with the anonymous type implementation) is that the compiler doesn't generate underlying Common Intermediate Language (CIL) code for the members corresponding to the custom names. However, even without an underlying member with the custom name, there is (seemingly) from the C# perspective, such a member.

Figure 1 Sample Code for Tuple Declaration and Assignment

1. Assigning a tuple to individually declared variables:

  (string country, string capital, double gdpPerCapita) =
    ("Malawi", "Lilongwe", 226.50);
  System.Console.WriteLine($@"The poorest country in the world in 2017 was {
    country}, {capital}: {gdpPerCapita}");

2. Assigning a tuple to individually declared variables that are pre-declared:

  string country;
  string capital;
  double gdpPerCapita;
  (country, capital, gdpPerCapita) =
    ("Malawi", "Lilongwe", 226.50);
  System.Console.WriteLine($@"The poorest country in the world in 2017 was {
    country}, {capital}: {gdpPerCapita}");

3. Assigning a tuple to individually declared and implicitly typed variables:

  (var country, var capital, var gdpPerCapita) =
    ("Malawi", "Lilongwe", 226.50);
  System.Console.WriteLine($@"The poorest country in the world in 2017 was {
    country}, {capital}: {gdpPerCapita}");

4. Assigning a tuple to individually declared variables that are implicitly typed with a distributive syntax:

  var (country, capital, gdpPerCapita) =
    ("Malawi", "Lilongwe", 226.50);
  System.Console.WriteLine($@"The poorest country in the world in 2017 was {
    country}, {capital}: {gdpPerCapita}");

5. Declaring a named item tuple, assigning it tuple values and then accessing the tuple items by name:

  (string Name, string Capital, double GdpPerCapita) countryInfo =
    ("Malawi", "Lilongwe", 226.50);
  System.Console.WriteLine($@"The poorest country in the world in 2017 was {
    countryInfo.Name}, {countryInfo.Capital}: {countryInfo.GdpPerCapita}");

6. Assigning a named item tuple to a single implicitly typed variable and then accessing the tuple items by name:

  var countryInfo =
    (Name: "Malawi", Capital: "Lilongwe", GdpPerCapita: 226.50);
  System.Console.WriteLine($@"The poorest country in the world in 2017 was {
    countryInfo.Name}, {countryInfo.Capital}: {countryInfo.GdpPerCapita}");

7. Assigning an unnamed tuple to a single implicitly typed variable and then accessing the tuple elements by their Item-number property:

  var countryInfo =
    ("Malawi", "Lilongwe", 226.50);
  System.Console.WriteLine($@"The poorest country in the world in 2017 was {
    countryInfo.Item1}, {countryInfo.Item2}: {countryInfo.Item3}");

8. Assigning a named item tuple to a single implicitly typed variable and then accessing the tuple items by their Item-number property:

  var countryInfo =
    (Name: "Malawi", Capital: "Lilongwe", GdpPerCapita: 226.50);
  System.Console.WriteLine($@"The poorest country in the world in 2017 was {
    countryInfo.Item1}, {countryInfo.Item2}: {countryInfo.Item3}");

9. Discarding portions of the tuple with underscores:

  (string name, _, double gdpPerCapita) =
    ("Malawi", "Lilongwe", 226.50);


For all the named tuple local variable examples, for example:

  var countryInfo = (Name: "Malawi", Capital: "Lilongwe", GdpPerCapita: 226.50)

it's clearly possible that the names could be known by the compiler for the remainder of the scope of the tuple because that scope is bounded within the member in which it's declared. And, in fact, the compiler (and IDE) quite simply rely on this scope to allow accessing each item by name. In other words, the compiler looks at the item names within the tuple declaration and leverages them to allow code that uses those names within the scope. It's for this reason, as well, that the ItemX methods aren't shown in the IDE IntelliSense as available members on the tuple (the IDE simply ignores them and replaces them with the named items).

Determining the item names when scoped within a member is reasonable for the compiler, but what happens when a tuple is exposed outside the member—such as a parameter or return from a method that's in a different assembly (for which there's possibly no source code available)? For all tuples that are part of the API (whether a public or private API), the compiler adds item names to the metadata of the member in the form of attributes. For example, this:

  [return: System.Runtime.CompilerServices.TupleElementNames(
    new string[] {"First", "Second"})]
  public System.ValueTuple<string, string> ParseNames(string fullName)
  {
    // ...
  }

is the C# equivalent of what the compiler generates for the following:

  public (string First, string Second) ParseNames(string fullName)

On a related note, C# 7.0 doesn't enable the use of custom item names when using the explicit System.ValueTuple<…> data type. Therefore, if you replace var in Example 8 of Figure 1, you'll end up with warnings that each item name will be ignored.

Here are a few additional miscellaneous facts to keep in mind about System.ValueTuple<…>:

• There are a total of eight generic System.ValueTuple structs corresponding to the possibility of supporting a tuple with up to seven items. For the eighth tuple, System.ValueTuple<T1, T2, T3, T4, T5, T6, T7, TRest>, the last type parameter allows specifying an additional value tuple, thus enabling support for n items. If, for example, you specify a tuple with 8 parameters, the compiler will automatically generate a System.ValueTuple<T1, T2, T3, T4, T5, T6, T7, System.ValueTuple<TSub1>> as the underlying implementing type. (For completeness, System.ValueTuple<T1> exists, but will really only be used directly and only as a type. It will never be used directly by the compiler because the C# tuple syntax requires a minimum of two items.)

• There is a non-generic System.ValueTuple that serves as a tuple factory with Create methods corresponding to each value tuple arity. The ease of using a tuple literal, such as var t1 = ("Inigo Montoya", 42), supersedes the Create method at least for C# 7.0 (or later) programmers.

• For all practical purposes, C# developers can essentially ignore System.ValueTuple and System.ValueTuple<T>.

There's another tuple type that was included with the .NET Framework 4—System.Tuple<…>. At that time, it was expected to be the core tuple implementation going forward. However, once C# supported tuple syntax, it was realized that a value type generally performed better and so System.ValueTuple<…> was introduced, effectively replacing System.Tuple<…> in all cases except for backward compatibility with existing APIs that depend on System.Tuple<…>.

Wrapping Up
What many folks didn't realize when it was first introduced is that the new C# 7.0 tuple all but replaces anonymous types—and provides additional functionality. Tuples can be returned from methods, for example, and the item names are persisted in the API such that meaningful names can be used in place of ItemX naming. And, like anonymous types, tuples can even represent complex hierarchical structures such as those that might be constructed in more complex LINQ queries (albeit, like with anonymous types, developers should do this with caution). That said, this could possibly lead to situations where the tuple value type exceeds 128 bytes and, therefore, might be a corner case for when to use anonymous types, because an anonymous type is a reference type. Except for these corner cases (accessing via typical reflection might be another example), there's little to no reason to use an anonymous type when programming with C# 7.0 or later.

The ability to program with a tuple type object has been around for a long time (as mentioned, a tuple class, System.Tuple<…>, was introduced with the .NET Framework 4, but was available in Silverlight before that). However, these solutions never had an accompanying C# syntax, but rather nothing more than a .NET API. C# 7.0 brings a first-class tuple syntax that enables literals—like var tuple = (42, "Inigo Montoya")—implicit typing, strong typing, public API utilization, integrated IDE support for named ItemX data and more. Admittedly, it might not be something you use in every C# file, but it's likely something you'll be grateful to have when the need arises and you'll welcome the tuple syntax over the alternative out parameter or anonymous type.

Much of this article derives from my "Essential C#" book (IntelliTect.com/EssentialCSharp), which I'm currently in the midst of updating to "Essential C# 7.0." For more information on this topic, check out Chapter 3.

TUPLE ITEM NAMING GUIDELINES
Do use camelCase for all variables declared using tuple syntax.
Consider using PascalCase for all tuple item names.

Mark Michaelis is founder of IntelliTect, where he serves as its chief technical architect and trainer. For nearly two decades he's been a Microsoft MVP, and a Microsoft Regional Director since 2007. Michaelis serves on several Microsoft software design review teams, including C#, Microsoft Azure, SharePoint and Visual Studio ALM. He speaks at developer conferences and has written numerous books, including his most recent, "Essential C# 6.0 (5th Edition)" (itl.tc/EssentialCSharp). Contact him on Facebook at facebook.com/Mark.Michaelis, on his blog at IntelliTect.com/Mark, on Twitter: @markmichaelis or via e-mail at [email protected].

Thanks to the following Microsoft technical expert for reviewing this article: Mads Torgersen


ROYAL PACIFIC RESORT
AT UNIVERSAL ORLANDO
NOVEMBER 12-17

Coding in Paradise
Grab your flip flops, and your laptops, and make plans to attend Visual Studio Live!
(VSLive!™), the conference more developers rely on to expand their .NET skills and the
ability to build better applications.
Over six full days of unbiased and cutting-edge education on the Microsoft Platform,
developers, engineers, designers, programmers and more will soak in the knowledge on
everything from Visual Studio and the .NET framework, to AngularJS, ASP.NET and Xamarin.


5 GREAT CONFERENCES, 1 GREAT PRICE
Visual Studio Live! Orlando is part of Live! 360, the Ultimate Education Destination. This means you'll have access to four (4) other co-located events at no additional cost.

NEW: HANDS-ON LABS
Join us for full-day, pre-conference hands-on labs Sunday, November 12. Only $595 through August 11.

Whether you are an Engineer, Developer, Programmer, Software Architect or Software Designer, you will walk away from this event having expanded your .NET skills and the ability to build better applications.

Five (5) events and hundreds of sessions to choose from—mix and match sessions to create your own, custom event line-up—it's like no other conference available today!

REGISTER BY AUGUST 11 AND SAVE $500! Use promo code ORLAUG4. *Savings based on 5-day packages only. See website for details.

VSLIVE.COM/ORLANDOMSDN
Don’t Get Me Started DAVID S. PLATT

Salt and Pepper

We're currently celebrating the 50th anniversary of the classic Beatles album "Sgt. Pepper's Lonely Hearts Club Band" (SPLHCB). Its beautiful strains brighten my office atmosphere as I write these words. Other writers may address its groundbreaking musical effects (see the interview with the recording engineer at bit.ly/2rEetU0), or its place in the evolution of rock music. Or its spoofs, from Doonesbury (bit.ly/2sjceHR) to National Lampoon (bit.ly/2sHvCAN). But contemplating Sgt. Pepper today makes me notice the ways in which changes in listening technology have driven changes in musical artistry.

At the time of SPLHCB's release, essentially all music was sold on LP albums. You had to buy the whole package, and listen to all of its songs sequentially. The progression to cassette tapes and then to CDs didn't change that constraint. Therefore, the artist had to carefully compose the sequence of songs on the album, as their influence on each other was inescapable. The Beatles placed George Harrison's introspective, sitar-laden "Within You, Without You" ahead of Paul McCartney's whimsical "When I'm Sixty-Four," driven by its trio of clarinets. Reversing that order would have induced entirely different feelings in even a casual listener. They carefully slotted Ringo's "With a Little Help From My Friends" into the second track, where it would do the least damage, and gave it introductory applause effects to pre-dispose the audience's perceptions toward approval.

The digital revolution—the liberation of pure thought-stuff from the profane physical medium on which it resided—undid these artistic decisions. Online stores such as iTunes and Amazon sold individual songs, so you didn't have to buy the bad ones. Any listener could easily rip CDs to disk, composing playlists that mixed and matched tracks and artists in any order. We lost that part of the artist's intention.

And that liberation/loss doesn't solely affect the album's song sequences. It also ripples through the content of individual songs. An artist releasing an album today can't know which track the listener is hearing before or after any song. Therefore, each song needs to be an island unto itself, rather than part of an artistic whole. How can anyone compose or play the final orchestral crescendo in "A Day in the Life" (bit.ly/1LLne4Z), terminating in the world's most famous piano chord, without intending to signal the end of the larger work to which it belongs? (That remains my biggest dilemma on playing Sgt. Pepper from end to end: What the heck do I play next?)

Sometimes this liberation from pre-imposed order is good. When I search Spotify for an artist, it will by default play a shuffle of that artist's most popular tunes. If I'm introducing my daughters to the Grateful Dead, that's not such a bad thing. But sometimes it's not so good: allowing them to reach the age of majority without experiencing the spare, wandering piano at the end of Bruce Springsteen's "Incident on 57th Street" segueing straight into the first crashing chords of "Rosalita" would be abdicating my duty as a parent. They ask me, "Daddy, what was all the fuss about the Beatles' White Playlist?" and I'm not sure what to tell them. Some things are worth digging for.

I can hear you thinking: Plattski, you always were a Luddite in this industry, failing to worship technology for its own sake as we all do, insisting on a practical benefit before you'd jump on any bandwagon (see, for example, my March 2016 column, "The Internet of Invisible Things," msdn.com/magazine/mt683803). But now you're going positively Amish on us. Your nostalgia for the original artistic sequences is like an old jeweler moaning over the loss of those beautiful metal components in a mechanical watch, when a simple quartz oscillator keeps far better time for a tenth of the price.

I'm not saying you shouldn't make your own playlists; I certainly make mine. (Try my Trop Rock playlist that Spotify automatically exports to my Facebook page.) But the 50th anniversary of Sgt. Pepper reminds me to carefully examine the original artists' chosen sequence and content, especially for albums which pre-date easy ripping and self-composition. I expect some cheers for this idea now, and even more in two years, when we celebrate the 50th anniversary of "Abbey Road."

David S. Platt teaches programming .NET at Harvard University Extension School and at companies all over the world. He's the author of 11 programming books, including "Why Software Sucks" (Addison-Wesley Professional, 2006) and "Introducing Microsoft .NET" (Microsoft Press, 2002). Microsoft named him a Software Legend in 2002. He wonders whether he should tape down two of his daughter's fingers so she learns how to count in octal. You can contact him at rollthunder.com.

File Format APIs

Powerful File APIs that are easy and intuitive to use


Native APIs for .NET, Java & Cloud

Using Aspose.Words for .NET to


Convert Word Docs to HTML -
Case Study
DOC, XLS, JPG,
PNG, PDF, BMP,
MSG, PPT, VSD,
Adding File Conversion and
XPS & many other
Manipulation to Business Systems formats.

www.aspose.com

EU Sales: +44 141 628 8900 US Sales: +1 903 306 1676 AU Sales: +61 2 8006 6987
[email protected]
Aspose.Total
Every Aspose API combined in one powerful suite.

Aspose.Cells Aspose.BarCode
XLS, CSV, PDF, SVG, HTML, PNG JPG, PNG, BMP, GIF, TIFF, WMF
BMP, XPS, JPG, SpreadsheetML... ICON...

Aspose.Words Aspose.Tasks
DOC, RTF, PDF, HTML, PNG XML, MPP, SVG, PDF, TIFF
ePub, XML, XPS, JPG... PNG...

Aspose.Pdf Aspose.Email
PDF, XML, XSL-FO, HTML, BMP MSG, EML, PST, MHT, OST
JPG, PNG, ePub... OFT...

Aspose.Slides Aspose.Imaging
PPT, POT, ODP, XPS PDF, BMP, JPG, GIF, TIFF
HTML, PNG, PDF... PNG...

and many more!

Contact Us:
US: +1 903 306 1676
EU: +44 141 628 8900
AU: +61 2 8006 6987 File Format APIs
[email protected]
Working with Files?
Try Aspose File APIs

Convert
Print
Create
Combine
Modify

files from your applications!

Over 15,000 Happy Customers

.NET Java Cloud

Get your FREE evaluation copy at www.aspose.com


Your File Format APIs
Aspose.Cells
Work with spreadsheets and data without depending on Microsoft Excel
• Solution for spreadsheet creation, manipulation and conversion.
• Import and export data.
ASPOSE.CELLS IS A
PROGRAMMING API that allows
developers to create, manipulate
and convert Microsoft Excel
spreadsheet files from within their
own applications. Its powerful
features make it easy to convert
worksheets and charts to graphics
or save
reports to Aspose.Cells lets developers work with data sources, formatting, even formulas.

PDF. Automation. reporting.


Aspose.
A flexible API • Powerful formula engine.
Cells for simple Common Uses • Complete formatting control.
speeds up and complex • Building dynamic reports on Supported File Formats
working spreadsheet the fly.
with • Creating Excel dashboards with XLS, XLSX, XLSM, XMPS, XLTX,
programming. charts and pivot tables. XLTM, ODS, XPS, SpreadsheetML,
Microsoft
Excel • Rendering and printing tab delim., CSV, TXT, PDF, HTML, and
files. The spreadsheets and graphics with many image formats including SVG,
API is a flexible tool for simple high fidelity. TIFF, JPEG, PNG and GIF.
• Exporting data to, or importing
tasks such as file conversion, as Format support varies across platforms.
from, Excel and other
well as complex tasks like building
spreadsheets.
models. Developers control page
• Generating, manipulating and
layout, formatting, charts and editing spreadsheets.
formulas. They can read and write • Converting spreadsheets to
spreadsheet files and save out to a images or other file formats.
wide variety of image and text file
Key Features Platforms
formats.
• A complete spreadsheet
Fast, scalable, and reliable,
manipulation solution.
Aspose.Cells saves time and effort
• Flexible data visualization and
compared to using Microsoft Office

Pricing Info
Standard Enhanced Standard Enhanced
Developer Small Business $999 $1498 Site Small Business $4995 $7490
Developer OEM $2997 $4494 Site OEM $13986 $20972
The pricing info above is for .NET: prices for other platforms may differ. For the latest, contact sales.

www.aspose.com

EU: +44 141 628 8900 US: +1 903 306 1676 Oceania: +61 2 8006 6987
[email protected]

pg 4
Aspose.Cells for
.NET, Java, Cloud & more

File Formats
XLS, CSV, ODS, PDF, SVG, HTML, PNG, BMP, XPS, JPG
SpreadsheetML and many others.

Spreadsheet Manipulation
Aspose.Cells lets you create, import, and export
spreadsheets and also allows you to manipulate contents,
cell formatting, and file protection.

Creating Charts
Aspose.Cells comes with complete support for charting
and supports all standard chart types. Also, you can
convert charts to images.

Graphics Capabilities
Easily convert worksheets to images as well as adding
images to worksheets at runtime.

100% Standalone

Aspose.Cells does not require Microsoft Office to


Get your FREE Trial at be installed on the machine in order to work.
http://www.aspose.com

File Format APIs


Aspose.Words
Program with word processing documents independently of Microsoft Word
• Solution for document creation, manipulation and conversion.
• Advanced mail merge functionality.
ASPOSE.WORDS IS AN
ADVANCED PROGRAMMING
API that lets developers perform
a wide range of document
processing tasks with their own
applications. Aspose.Words
makes it possible to generate,
modify, convert, render and print
documents without Microsoft
Aspose.Words has sophisticated controls for formatting and managing tables and other
Word. It provides sophisticated and content.
flexible access to, and control over,
Common Uses Key Features
Microsoft
Word files. • Generating reports with • A complete Microsoft Word
complex mail merging; mail document manipulation
Aspose. Generate, merging images. solution.
Words is modify, convert, • Populating tables and • Extensive mail merge features.
powerful, • Complete formatting control.
render and print documents with data from a
user- database. • High-fidelity conversion,
friendly documents • Inserting formatted text, rendering and printing.
and without paragraphs, tables and Supported File Formats
feature Microsoft Word. images into Microsoft Word
rich. It documents. DOC, DOCX, ODT, OOXML, XML,
saves • Adding barcodes to HTML, XHTML, MHTML, EPUB, PDF,
developers time and effort documents. XPS, RTF, and a number of image
compared to using Microsoft Office • Inserting diagrams and formats, including TIFF, JPEG, PNG
Automation and makes gives them watermarks into Word and GIF.
powerful document management documents. Format support varies across
tools. • Formatting date and numeric
platforms.
fields.
Aspose.Words makes creating, Platforms
changing and converting DOC and
other word processing file formats
fast and easy.

Pricing Info
Standard Enhanced Standard Enhanced
Developer Small Business $999 $1498 Site Small Business $4995 $7490
Developer OEM $2997 $4494 Site OEM $13986 $20972
The pricing info above is for .NET: prices for other platforms may differ. For the latest, contact sales.
www.aspose.com

EU: +44 141 628 8900 US: +1 903 306 1676 Oceania: +61 2 8006 6987
[email protected]

pg 6
Case Study: Aspose.Words for .NET
ProHire Staffing - Using Aspose.Words for .NET to convert Word Docs
to HTML
PROHIRE IS THE WORKFORCE difference between the Aspose. Implementation
SOLUTIONS LEADER Words and other products was the Once we had the Aspose DLL our
IN THE UNITED STATES obvious conversion quality from MS developer was able to implement
AND SPECIALIZE IN THE Word to HTML. Aspose.Words for .NET in a few
RECRUITMENT OF SALES Finding a Solution hours. The transitions with Aspose.
AND SALES MANGEMENT Words for .NET was very painless to
We had tested other products that
PROFESSIONALS. We were do.
converted Word to HTML. Every one
founded with the goal of becoming
we tested had some problem with Outcome
the premier provider of executive
the conversion. Some of them lost
search and placement services to We are very pleased with the
elements of the resume
the Fortune 500 and success of our Aspose.Words for
during the conversion.
Inc. 500 Companies. .NET implementation. Aspose.Words
Most of them changed
Problem the format of the resume is a very powerful development tool
“The transitions that is well documented and easy to
ProHire uses or changed the color of
Bullhorn ATS as its
with Aspose. the text unexpectedly. install. The documentation is easy
Words for This is unacceptable to understand and use. If you want
Application Tracking
.NET was very when you are sending a product to convert Word Docs
System to track the
to HTML look no further. ProHire is
electronic handling painless to do.” a resume to a hiring
manger. We were happy to recommend Aspose.
of its recruitment
needs. We wanted very satisfied with the
to integrate the results. We did not need
any technical support because This is an extract from a case study on
Bullhorn API with our new website.
documentation was sufficient to us. our website. For the full version, go to:
Our goal was to convert MS Word
www.aspose.com/corporate/
Documents resumes into a clean
customers/case-studies.aspx
and concise HTML format into our
existing .Net Stack. The converted
HTML resume version needed to
look close to the original.
Looking for a Solution
We chose the ASPOSE.Words
product because it easily integrated
into our existing .Net stack, and
provided a quality MS Word to
HTML conversion. The product
was easy to download, and with a
few lines of code we were up and
running. We found the primary The converted HTML resume version needed to look close to the original.

www.aspose.com

EU: +44 141 628 8900 US: +1 903 306 1676 Oceania: +61 2 8006 6987
[email protected]

pg 7
Open, Create, Convert, Print
& Save Files
from within your own applications.

ASPOSE.TOTAL
allows you to process these file formats:

• Word documents
• Excel spreadsheets
• PowerPoint presentations
• PDF documents
• Project documents
• Visio documents
• Outlook emails
• OneNote documents

DOC XLS PPT PDF EML


PNG XML RTF HTML VSD
BMP & barcode images.

Contact Us:
US: +1 903 306 1676
EU: +44 141 628 8900
AU: +61 2 8006 6987
File Format APIs [email protected]
Helped over 11,000 companies and over 300,000 users work with
documents in their applications.

.NET, Java, and Cloud

File Format APIs

GET STARTED NOW

• Free Trial
• 30 Day Temp License
• Free Support
• Community Forums
• Live Chat
• Blogs
• Examples
• Video Demos
Adding File Conversion and Manipulation to
Business Systems
How often do people in your organization complain that they can’t get information in the file format
and layout they want? Converting documents from one format to another without losing layout and
formatting should be simple, but it can be frustrating for both users and developers.

EXTRACTING DATA FROM A Automation lets you use Microsoft barcodes and OCR. The APIs are
DATABASE AND DELIVERING Office programs server-side. It is optimised for stability, speed and
IT TO THE SALES TEAM AS A not how the Office products were ease of use. Our APIs save users
REPORT, complete with charts and designed to be used. It can work weeks, sometimes months, of effort.
corporate branding, is fine. Until the well but you might notice issues
sales team says that they want it as with the stability, security and
a Microsoft Excel file, speed of the system,
and could you add a as well as cost.
dashboard? Using an API: The
Aspose creates
Using information API market has lots of
APIs that work free and commercial
from online forms in
letters that can are
independently solutions, some
printed and posted of Microsoft very focused, some
is easy. But what if Office feature-rich. An API
you also want to add Automation. integrates with your
tracking barcodes and code and gives you
archive a digital copy access to a range of
Finding the Right Tool
as a PDF? new features.
To find the product that’s right for
Ensuring that your business system Look to Aspose you, take a systematic approach:
supports all the different Microsoft
Aspose are API experts. We create • List must-have and nice-to-
Office file formats your users want
APIs, components and extensions have features.
can be difficult. Sometimes the
that work independently of • Research the market.
native file format support of your
Microsoft Automation to extend • Ask for recommendations.
system lets you down. When that is
a platform’s native file format • Select a few candidates .
the case, use tools that extend that • Run trials.
manipulation capabilities.
capability. A good tool can save you • Evaluate
time and effort. Aspose have developed APIs for
• ease of use,
.NET, Java, Cloud and Android that
Document Conversion Options • support and
lets developers convert, create and
documentation,
Building your own solution: Time- manipulate Microsoft Office files –
• performance, and
consuming and costly, this option Microsoft Word, Excel, PowerPoint,
Visio and Project – and other
• current and future
is only sensible if the solution you
popular business formats, from needs.
develop is central to your business.
PDFs and images to emails. We also
Using Microsoft Office
have APIs for working with images,
Automation: Microsoft Office

www.aspose.com

EU: +44 141 628 8900 US: +1 903 306 1676 Oceania: +61 2 8006 6987
[email protected]

pg 10
Aspose.BarCode
A complete toolkit for barcode generation and recognition
• Generate barcodes with customer defined size and color.
• Recognize a large number of barcode types from images.
ASPOSE.BARCODE IS A EXIP and ICON.
ROBUST AND RELIABLE Format support varies across platforms.
BARCODE GENERATION
AND RECOGNITION API that Supported Barcodes
allows developers to add barcode Linear: EAN13, EAN8, UPCA, UPCE,
generation and recognition Interleaved2of5, Standard2of5, MSI,
functionality to their applications Code11, Codabar, EAN14(SCC14),
quickly and easily. SSCC18, ITF14, Matrix 2 of 5, PZN,
Aspose.BarCode offers a large number of
Aspose.BarCode supports most symbologies and formatting options. Code128, Code39 Extended,
established barcode specifications. Code39 Standard, OPC, Code93
It can export generated barcodes to clean up difficult to read images to Extended, Code93 Standard,
multiple image formats, including improve recognition. IATA 2 of 5, GS1Code128, ISBN,
BMP, GIF, JPED, PNG and TIFF. ISMN, ISSN, ITF6, Pharmacode,
Common Uses DatabarOmniDirectional, VIN,
Aspose.
• Generating and recognizing DatabarTruncated, DatabarLimited,
BarCode
barcode images. DatabarExpanded, PatchCode,
gives Robust and
• Printing barcode labels. Supplement 2D: PDF417,
you full reliable barcode • Enhancing workflow by adding MacroPDF417, DataMatrix, Aztec,
control generation and barcode functionality.
over every QR, Italian Post 25, Code16K,
recognition. • Using recognition functions to GS1DataMatrix Postal: Postnet,
aspect drive real-life work processes.
of the Planet, USPS OneCode, Australia
barcode
Key Features Post, Deutsche Post Identcode,
image, from background and • Barcode generation and AustralianPosteParcel, Deutsche
bar color, through image quality, recognition. Post Leticode, RM4SCC,
rotation angle, X-dimension, • Comprehensive support for 1D SingaporePost, SwissPostParcel
captions, and resolution. and 2D symbologies.
• Image processing for improved
Aspose.BarCode can read and recognition. Platforms
recognize most common 1D and
2D barcodes from any image and at
Supported File Formats
any angle. Filters help developers JPG, TIFF, PNG, BMP, GIF, EMF, WMF,

Pricing Info
Standard Enhanced Standard Enhanced
Developer Small Business $599 $1098 Site Small Business $2995 $5490
Developer OEM $1797 $3294 Site OEM $8386 $15372
The pricing info above is for .NET: prices for other platforms may differ. For the latest, contact sales.

www.aspose.com

EU: +44 141 628 8900 US: +1 903 306 1676 Oceania: +61 2 8006 6987
[email protected]

Aspose for Cloud

The easiest API to Create, Convert & Automate Documents in the cloud.

Aspose.Words for Cloud
• Create and convert docs
• Manipulate text
• Render documents
• Annotate

Aspose.Cells for Cloud
• Create spreadsheets
• Convert spreadsheets
• Manipulate cells and formulas
• Render spreadsheets

Aspose.Slides for Cloud
• Create presentations
• Convert
• Manage slides
• Edit text and images

Aspose.Pdf for Cloud
• Create and convert PDFs
• Manipulate text, images
• Add pages, split, encrypt
• Manage stamps

Aspose.Email for Cloud
• Create, update, and convert messages
• Extract attachments

Aspose.BarCode for Cloud
• Generate barcodes
• Read barcodes
• Set attributes
• Multiple image formats

Use with any language, without installing anything!

Free Evaluation at www.aspose.com
Aspose.Email
Work with emails and calendars without Microsoft Outlook
• Complete email processing solution.
• Message file format support.
ASPOSE.EMAIL IS AN EMAIL PROGRAMMING API that allows developers to access and work with PST, EML, MSG and MHT files. It also offers an advanced API for interacting with enterprise mail systems like Exchange and Gmail.

Aspose.Email can work with HTML and plain text emails, attachments and embedded OLE objects. It allows developers to work against SMTP, POP, FTP and Microsoft Exchange servers. It supports mail merge and iCalendar features, customized headers and bodies, and searching archives, and has many other useful features. Aspose.Email allows developers to focus on managing email without getting into the core of email and network programming. It gives you the controls you need.

Aspose.Email lets your applications work with emails, attachments, notes and calendars.

Common Uses
• Sending email with HTML formatting and attachments.
• Mail merging and sending mass mail.
• Connecting to POP3 and IMAP mail servers to list and download messages.
• Connecting to Microsoft Exchange Servers to list, download and send messages.
• Creating and updating tasks using iCalendar.
• Loading from and saving messages to file or stream (EML, MSG or MHT formats).

Key Features
• A complete email processing solution.
• Support for MSG and PST formats.
• Microsoft Exchange Server support.
• Complete recurrence pattern solution.

Supported File Formats
MSG, MHT, OST, PST, EMLX, TNEF, and EML.
Format support varies across platforms.

Platforms
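Converting a message between formats could be sketched as follows. Treat this as a hypothetical example: `MailMessage.Load` and `SaveOptions.DefaultMsgUnicode` are assumptions about the Aspose.Email for .NET API surface, and the exact save-option names have varied between releases.

```csharp
// Hypothetical sketch: load an EML message from disk and re-save it as
// an Outlook MSG file. Member names are assumed, not verified against
// a specific Aspose.Email release.
using Aspose.Email;

class EmailDemo
{
    static void Main()
    {
        // Load an existing EML file.
        MailMessage message = MailMessage.Load("invoice.eml");
        System.Console.WriteLine(message.Subject);

        // Save the same message in Outlook MSG format.
        message.Save("invoice.msg", SaveOptions.DefaultMsgUnicode);
    }
}
```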

Pricing Info
                          Standard   Enhanced
Developer Small Business    $799      $1298
Developer OEM              $2397      $3894
Site Small Business        $3995      $6490
Site OEM                  $11186     $18172
The pricing info above is for .NET: prices for other platforms may differ. For the latest, contact sales.

Aspose.Pdf
Create PDF documents without using Adobe Acrobat
• A complete solution for programming with PDF files.
• Work with PDF forms and form fields.
ASPOSE.PDF IS A PDF DOCUMENT CREATION AND MANIPULATION API that developers use to read, write and manipulate PDF documents without using Adobe Acrobat. Aspose.Pdf is a sophisticated product that integrates with your application to add PDF capabilities.

Aspose.Pdf offers a wealth of features that lets developers compress files, create tables, work with links, add and remove security, handle custom fonts, integrate with external data sources, manage bookmarks, create tables of contents, create forms and manage form fields. It helps developers add and work with attachments, annotations and PDF form data; add, replace or remove text and images; split, concatenate, extract or insert pages; and print PDF documents.

Read, write and manipulate PDF documents independently of Adobe Acrobat.

Aspose.Pdf can be used to automatically complete PDF forms with external data.

Common Uses
• Creating and editing PDF files.
• Inserting, extracting, appending, concatenating and splitting PDFs.
• Working with text, images, tables, headers, and footers.
• Applying security, passwords and signatures.
• Working with forms and form fields.

Key Features
• PDF creation from XML or XSL-FO documents.
• PDF form and field support.
• Advanced security and encryption.
• High-fidelity printing and conversion.

Supported File Formats
PDF, PDF/A, PDF/A-1b, PCL, XSL-FO, LaTeX, HTML, XPS, TXT and a range of image formats.
Format support varies across platforms.

Platforms
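Creating a simple document might look like the sketch below. The `Document`, `Page` and `TextFragment` names follow the commonly documented Aspose.Pdf for .NET DOM API, but they are assumptions here and may differ by version.

```csharp
// Hypothetical sketch: build a one-page PDF from scratch and save it,
// with no Adobe Acrobat required. Names follow the documented DOM API
// but are not verified against a specific release.
using Aspose.Pdf;
using Aspose.Pdf.Text;

class PdfDemo
{
    static void Main()
    {
        // Create a new document with a single page of text.
        var document = new Document();
        Page page = document.Pages.Add();
        page.Paragraphs.Add(new TextFragment("Hello from Aspose.Pdf"));

        // Write the result to disk.
        document.Save("hello.pdf");
    }
}
```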

Pricing Info
                          Standard   Enhanced
Developer Small Business    $999      $1498
Developer OEM              $2997      $4494
Site Small Business        $4495      $6990
Site OEM                  $13986     $20972
The pricing info above is for .NET: prices for other platforms may differ. For the latest, contact sales.

Aspose.Pdf
.NET, Java & Cloud

File Formats
PDF, DOC, XML, XSL-FO, XPS, HTML, BMP, JPG, PNG, ePUB & other image file formats.

Create and Manipulate PDFs
Create new or edit/manipulate existing PDFs.

Form Field Features
Add form fields to your PDFs. Import and export form field data from select file formats.

Table Features
Add tables to your PDFs with formatting such as table border style, margin and padding info, column width and spanning options, and more.

Get started today at www.aspose.com


Aspose.Note for .NET
Aspose.Note for .NET is an API that lets developers convert Microsoft OneNote pages
to a variety of file formats, and extract the text and document information.

Conversion is fast and high-fidelity. The output looks like the OneNote page, no matter how complex the formatting or layout.

Aspose.Note works independently of Office Automation and does not require Microsoft
Office or OneNote to be installed.

Modify, convert, render and extract text and images from


Microsoft OneNote files without relying on OneNote or
other libraries.

Features

File Formats and Conversion
• Microsoft OneNote 2010, 2010 SP1, 2013: Load, Save
• PDF: Save
• Images (BMP, GIF, JPG, PNG): Save

Rendering and Printing
• Save as Image (BMP, GIF, JPG, PNG)
• Save as PDF

Document Management
• Extract text.
• Get the number of pages in a document.
• Get page information.
• Extract images.
• Get image information from a document.
• Replace text in document.
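A OneNote-to-PDF conversion could be sketched as below. This is a hypothetical example: the `Document` and `SaveFormat` names are assumptions about the Aspose.Note for .NET API based on the description above.

```csharp
// Hypothetical sketch: load a OneNote file and save it as PDF, with no
// OneNote installation required. Names are assumed, not verified
// against a specific Aspose.Note release.
using Aspose.Note;

class NoteDemo
{
    static void Main()
    {
        // Load the .one file and render it to PDF.
        var document = new Document("notes.one");
        document.Save("notes.pdf", SaveFormat.Pdf);
    }
}
```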
Aspose.Imaging
• Create images from scratch.
• Load existing images for editing purposes.
• Render to multiple file formats.

ASPOSE.IMAGING IS A CLASS LIBRARY that helps developers create image files from scratch or load existing ones for editing. Aspose.Imaging also provides the means to save the created or edited image to a variety of formats. All of this can be achieved without the need for an image editor. It works independently of other applications and, although Aspose.Imaging allows you to save to Adobe Photoshop® format (PSD), you do not need Photoshop installed on the machine.

Aspose.Imaging is flexible, stable and powerful. Its many features and image processing routines should meet most imaging requirements. Like all Aspose file format components, Aspose.Imaging introduces support for an advanced set of drawing features along with the core functionality. Developers can draw on the image surface either by manipulating the bitmap information or by using advanced functionality like Graphics and Paths.

Aspose.Imaging allows creation and manipulation of images.

Create images from scratch, or load existing ones...

Common Uses
• Create images from scratch.
• Load and edit existing images.
• Export images to a variety of formats.
• Add watermarks to images.
• Export CAD drawings to PDF & raster image formats.
• Crop, resize & RotateFlip images.
• Extract frames from multipage TIFF images.

Key Features
• Create, edit, and save images
• Multiple file formats
• Drawing features
• Export images

Supported File Formats
BMP, JPG, TIFF, GIF, PNG, PSD, DXF, DWG, and PDF.

Platforms
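A format conversion might look like the following sketch. The `Image.Load`/`Save` pattern and `PngOptions` type are assumptions about the Aspose.Imaging for .NET API, included here for illustration only.

```csharp
// Hypothetical sketch: load an existing BMP and re-save it as PNG,
// with no external image editor. Names are assumed, not verified
// against a specific Aspose.Imaging release.
using Aspose.Imaging;
using Aspose.Imaging.ImageOptions;

class ImagingDemo
{
    static void Main()
    {
        // Load the source image and export it in a different format.
        using (Image image = Image.Load("photo.bmp"))
        {
            image.Save("photo.png", new PngOptions());
        }
    }
}
```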
Pricing Info
                          Standard   Enhanced
Developer Small Business    $399       $898
Developer OEM              $1197      $2694
Site Small Business        $1995      $4490
Site OEM                   $5586     $12572
The pricing info above is for .NET.

Aspose.Slides
Work with presentations without using Microsoft PowerPoint
• Complete solution for working with presentation files.
• Export presentations and slides to portable or image formats.
ASPOSE.SLIDES IS A FLEXIBLE PRESENTATION MANAGEMENT API that helps developers read, write and manipulate Microsoft PowerPoint documents. Slides and presentations can be saved to PDF, HTML and image file formats without Microsoft PowerPoint.

Aspose.Slides offers a number of advanced features that make it easy to perform tasks such as rendering slides, exporting presentations, exporting slides to SVG and printing. Developers use Aspose.Slides to build customizable slide decks, add or remove standard graphics and automatically publish presentations to other formats.

Aspose.Slides gives developers the tools they need to work with presentation files. It integrates quickly and saves time and money.

Aspose.Slides has advanced features for working with every aspect of a presentation.

Common Uses
• Creating new slides and cloning existing slides from templates.
• Handling text and shape formatting.
• Applying and removing protection.
• Exporting presentations to images and PDF.
• Embedding Excel charts as OLE objects.
• Generating presentations from a database.

Key Features
• A complete presentation development solution.
• Control over text, formatting and slide elements.
• OLE integration for embedding external content.
• Wide support for input and output file formats.

Supported File Formats
PPT, HTML, POT, PPS, PPTX, POTX, PPSX, ODP, PresentationML, XPS, PDF and image formats including TIFF and JPG.
Format support varies across platforms.

Platforms
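Exporting a deck to PDF could be sketched as follows. The `Presentation` and `SaveFormat` names follow the commonly documented Aspose.Slides for .NET API, but treat the exact signatures as assumptions.

```csharp
// Hypothetical sketch: open a PowerPoint file and export it to PDF
// without PowerPoint installed. Names are assumed, not verified
// against a specific Aspose.Slides release.
using Aspose.Slides;
using Aspose.Slides.Export;

class SlidesDemo
{
    static void Main()
    {
        // Load the deck and render every slide into a PDF.
        using (var presentation = new Presentation("deck.pptx"))
        {
            presentation.Save("deck.pdf", SaveFormat.Pdf);
        }
    }
}
```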

Pricing Info
                          Standard   Enhanced
Developer Small Business    $799      $1298
Developer OEM              $2397      $3894
Site Small Business        $3995      $6490
Site OEM                  $11186     $18172
The pricing info above is for .NET: prices for other platforms may differ. For the latest, contact sales.

Support Services
Get the assistance you need, when you need it, from the people who know our products best.
• Free support for all, even when evaluating
• Get the level of support that suits you and your team
NO ONE KNOWS OUR PRODUCTS AS WELL AS WE DO. We develop them, support them and use them. Our support is handled through our support forums and is available to all Aspose users.

We are developers ourselves and understand how frustrating it is when a technical issue or a quirk in the software stops you from doing what you need to do. This is why we offer free support. Anyone who uses our products, whether they have bought them or are using an evaluation, deserves our full attention and respect. We have four levels of support that can fit your needs.

Work with the developers that developed and continue to maintain our products.

Support Options

Free
Everyone who uses Aspose products has access to our free support. Our software developers are on stand-by to help you succeed with your project, from evaluation to roll-out of your solution.

Priority
If you want to know when you'll hear back from us on an issue and know that your issue is prioritized, Priority Support is for you. It provides a more formal support structure and has its own forum that is monitored by our software engineers.

Enterprise
Enterprise customers often have very specific needs. Our Enterprise Support option gives them access to the product development team and influence over the roadmap. Enterprise Support customers have their own, dedicated issue tracking system.

Sponsored
Available to Enterprise customers that would like to request features, this higher-prioritized support can ensure your needed features are on our roadmap. A member of our team will produce a feature specification document to capture your requirements and how we intend to fulfill them, so the direction development will take is clear up-front.

Pricing Info
To see the Priority and Enterprise support rates, refer to the product price list, or contact our sales team.
Sponsored Support is unique so pricing is specific to each project. Please contact our sales team to discuss.


We’re Here to Help You

Aspose has 4 Support Services to best suit your needs

Free Support: Support forums at no charge.
Priority Support: 24-hour response time during the week, issue escalation, dedicated forum.
Enterprise Support: Communicate with product managers, influence the roadmap.
Sponsored Support: Get the feature you need built now.

Technical support is an issue that Aspose takes very seriously. Software must work quickly and dependably. When problems arise, developers need answers in a hurry. We ensure that our clients receive useful answers and solutions quickly.

Email • Live Chat • Forums

Contact Us
US Sales: +1 903 306 1676
EU Sales: +44 141 628 8900
AU Sales: +61 2 8006 6987
[email protected]
File Format APIs
