<?xml version="1.0" encoding="utf-8"?>
<rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:foaf="http://xmlns.com/foaf/0.1/" xmlns:og="http://ogp.me/ns#" xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#" xmlns:schema="http://schema.org/" xmlns:sioc="http://rdfs.org/sioc/ns#" xmlns:sioct="http://rdfs.org/sioc/types#" xmlns:skos="http://www.w3.org/2004/02/skos/core#" xmlns:xsd="http://www.w3.org/2001/XMLSchema#" version="2.0" xml:base="https://www.linuxjournal.com/">
  <channel>
    <title>ONNX</title>
    <link>https://www.linuxjournal.com/</link>
    <description/>
    <language>en</language>
    
    <item>
  <title>ONNX: the Open Neural Network Exchange Format</title>
  <link>https://www.linuxjournal.com/content/onnx-open-neural-network-exchange-format</link>
  <description>  &lt;div data-history-node-id="1339771" class="layout layout--onecol"&gt;
    &lt;div class="layout__region layout__region--content"&gt;
      
            &lt;div class="field field--name-node-author field--type-ds field--label-hidden field--item"&gt;by &lt;a title="View user profile." href="https://www.linuxjournal.com/user/800928" lang="" about="https://www.linuxjournal.com/user/800928" typeof="schema:Person" property="schema:name" datatype="" xml:lang=""&gt;Braddock Gaskill&lt;/a&gt;&lt;/div&gt;
      
            &lt;div class="field field--name-body field--type-text-with-summary field--label-hidden field--item"&gt;&lt;p&gt;&lt;em&gt;
An open-source battle is being waged for the soul of artificial
intelligence. It is being fought by industry titans, universities and
communities of machine-learning researchers worldwide. This article
chronicles one small skirmish in that fight: a standardized file format
for neural networks. At stake is the open exchange of data among a
multitude of tools instead of competing monolithic frameworks.
&lt;/em&gt;&lt;/p&gt;



&lt;p&gt;
The good news is that the battleground is Free and Open. None of the
big players are pushing closed-source solutions. Whether it is Keras and
TensorFlow backed by Google, Apache MXNet endorsed by Amazon, or Caffe2
and PyTorch supported by Facebook, all of these solutions are open-source software.
&lt;/p&gt;
&lt;p&gt;
Unfortunately, while these projects are &lt;em&gt;open&lt;/em&gt;, they are not
&lt;em&gt;interoperable&lt;/em&gt;. Each framework constitutes a complete stack that
until recently could not interface in any way with any other framework.
A new industry-backed standard, the Open Neural Network Exchange format,
could change that.
&lt;/p&gt;

&lt;p&gt;
Now, imagine a world where you can train a neural network in Keras,
run the trained model through the NNVM optimizing compiler and
deploy it to production on MXNet. And imagine that is just one of
countless combinations of interoperable deep learning tools, including
visualizations, performance profilers and optimizers. Researchers and
DevOps no longer need to compromise on a single toolchain that provides
a mediocre modeling environment and so-so deployment performance.
&lt;/p&gt;

&lt;p&gt;
What is required is a standardized format that can express any machine-learning model, store its trained parameters and weights, and be read and
written by a suite of independently developed software.
&lt;/p&gt;

&lt;p&gt;
Enter the &lt;a href="http://onnx.ai"&gt;Open Neural Network Exchange
Format&lt;/a&gt; (ONNX).
&lt;/p&gt;

&lt;span class="h3-replacement"&gt;
The Vision&lt;/span&gt;

&lt;p&gt;
To understand the drastic need for interoperability with a standard like
ONNX, we first must understand the ridiculous requirements we have for
existing monolithic frameworks.
&lt;/p&gt;

&lt;p&gt;
A casual user of a deep learning framework may think of it as a language
for specifying a neural network. For example, I want 100 input neurons,
three fully connected layers each with 50 ReLU outputs, and a softmax on
the output. My framework of choice provides either a domain-specific
language for this (as Caffe does) or bindings to a language like Python
with a clear API.
&lt;/p&gt;
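The architecture sketched above translates almost line for line into such an API. A minimal sketch using the Keras Sequential API; the 10-class softmax output size is an illustrative assumption, since the text leaves it unspecified:

```python
# The network described above: 100 input neurons, three fully connected
# layers of 50 ReLU units each, and a softmax on the output.
# The 10-class output size is an illustrative assumption.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(100,)),               # 100 input neurons
    layers.Dense(50, activation="relu"),     # three fully connected
    layers.Dense(50, activation="relu"),     # layers, 50 ReLU outputs
    layers.Dense(50, activation="relu"),     # each
    layers.Dense(10, activation="softmax"),  # softmax on the output
])

model.summary()
```

A few lines like these are the part of the framework the casual user sees; everything that follows them at training and inference time is the hidden bulk of the iceberg discussed next.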

&lt;p&gt;
However, the specification of the network architecture is only the tip of
the iceberg. Once a network structure is defined, the framework still
has a great deal of complex work to do to make it run on your CPU or
GPU cluster.
&lt;/p&gt;&lt;/div&gt;
      
            &lt;div class="field field--name-node-link field--type-ds field--label-hidden field--item"&gt;  &lt;a href="https://www.linuxjournal.com/content/onnx-open-neural-network-exchange-format" hreflang="en"&gt;Go to Full Article&lt;/a&gt;
&lt;/div&gt;
      
    &lt;/div&gt;
  &lt;/div&gt;

</description>
  <pubDate>Wed, 25 Apr 2018 14:19:00 +0000</pubDate>
    <dc:creator>Braddock Gaskill</dc:creator>
    <guid isPermaLink="false">1339771 at https://www.linuxjournal.com</guid>
    </item>
<item>
  <title>Now Available: April 2018 issue of Linux Journal</title>
  <link>https://www.linuxjournal.com/content/now-available-april-2018-issue-linux-journal</link>
  <description>  &lt;div data-history-node-id="1339747" class="layout layout--onecol"&gt;
    &lt;div class="layout__region layout__region--content"&gt;
      
            &lt;div class="field field--name-node-author field--type-ds field--label-hidden field--item"&gt;by &lt;a title="View user profile." href="https://www.linuxjournal.com/users/carlie-fairchild" lang="" about="https://www.linuxjournal.com/users/carlie-fairchild" typeof="schema:Person" property="schema:name" datatype="" xml:lang=""&gt;Carlie Fairchild&lt;/a&gt;&lt;/div&gt;
      
            &lt;div class="field field--name-body field--type-text-with-summary field--label-hidden field--item"&gt;&lt;p&gt;&lt;cite&gt;Linux Journal&lt;/cite&gt;'s April issue takes a Deep Dive Into the Cloud. Articles in this issue include:&lt;/p&gt;


&lt;ul&gt;&lt;li&gt;Vendor Lock-in and the Cloud&lt;/li&gt;
	&lt;li&gt;Cloud Computing Basics&lt;/li&gt;
	&lt;li&gt;Complexities of Cloud Billing&lt;/li&gt;
	&lt;li&gt;Multiprocessing in Python&lt;/li&gt;
	&lt;li&gt;Smart-Home Hacks&lt;/li&gt;
	&lt;li&gt;A Talk with OSI President Simon Phipps&lt;/li&gt;
	&lt;li&gt;Tips for Securing Your Cloud Environment&lt;/li&gt;
	&lt;li&gt;Introducing ONNX: the Open Neural Network Exchange Format&lt;/li&gt;
	&lt;li&gt;EU's New Copyright Laws Attack Open Source&lt;/li&gt;
	&lt;li&gt;Write an Adventure Game in the Terminal with ncurses&lt;/li&gt;
	&lt;li&gt;Bash Project: Create Dynamic Wallpaper&lt;/li&gt;
	&lt;li&gt;FOSS Project Spotlight: Ravada&lt;/li&gt;
	&lt;li&gt;...and more!&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And yes, &lt;cite&gt;Linux Journal&lt;/cite&gt; continues to boast as many pages as most technical books; this month’s issue comes in at a hefty 178 pages.&lt;/p&gt;


&lt;p&gt;Subscribers, you can &lt;a href="https://secure2.linuxjournal.com/pdf/dljdownload.php"&gt;download your April issue&lt;/a&gt; now.&lt;/p&gt;


&lt;p&gt;Not a subscriber? It’s not too late. &lt;a href="http://www.linuxjournal.com/subscribe"&gt;Subscribe today&lt;/a&gt; and receive instant access to this and all back issues since 2010.&lt;/p&gt;


&lt;p&gt;Want to buy a single issue? Buy the April magazine or other single back issues &lt;a href="https://linuxjournalstore.com/collections/back-issues-of-linux-journal/products/april-2018-issue-of-linux-journal"&gt;in the LJ store&lt;/a&gt;.&lt;/p&gt;&lt;/div&gt;
      
            &lt;div class="field field--name-node-link field--type-ds field--label-hidden field--item"&gt;  &lt;a href="https://www.linuxjournal.com/content/now-available-april-2018-issue-linux-journal" hreflang="en"&gt;Go to Full Article&lt;/a&gt;
&lt;/div&gt;
      
    &lt;/div&gt;
  &lt;/div&gt;

</description>
  <pubDate>Mon, 02 Apr 2018 15:17:46 +0000</pubDate>
    <dc:creator>Carlie Fairchild</dc:creator>
    <guid isPermaLink="false">1339747 at https://www.linuxjournal.com</guid>
    </item>

  </channel>
</rss>
