Deep Learning with Apache Flink and DeepLearning4J
Flink Forward 2016, Berlin, Germany
Suneel Marthi (@suneelmarthi)
About me
• Senior Principal Software Engineer, Office of Technology, Red Hat Inc.
•Member of the Apache Software Foundation
•PMC member on Apache Mahout, Apache Pirk, Apache Incubator
•PMC Chair, Apache Mahout (April 2015 - April 2016)
Outline
● What is Deep Learning?
● Overview of the DeepLearning4J Ecosystem
● Deep Learning Workflows
● ETL & Vectorization with DataVec
● Apache Flink and DL4J
What is Deep Learning?
Handwriting Recognition
Face Recognition (Facebook)
Image Generation
Self-Driving Cars
DL has been very successful with Image Classification
Dogs vs. Cats: https://www.kaggle.com/c/dogs-vs-cats
● Deep Learning is a series of steps for automated feature extraction
o Based on techniques that have been around for several years
o Several techniques chained together to automate feature engineering
o “Deep” due to several interconnected layers of nodes stacked together between the input and the output.
“Deep learning will make you acceptable to the learned; but it is only an obliging and easy behaviour, and entertaining conversation, that will make you agreeable to all companies”
- James Burgh
Popular Deep Neural Networks
● Deep Belief Networks
o Most popular architecture
● Convolutional Neural Networks
o Successful in image classification
● Recurrent Networks
o Time series analysis
o Sequence modelling
Deep Learning in Enterprise
● Ability to work with small and big data easily
o Don’t want to change tooling because we moved to Hadoop
● Ability to not get caught up in things like vectorization and ETL
o Need to focus on better models
o Understanding your data is very important
● Ability to experiment with lots of models
DeepLearning4J
● “The Hadoop of Deep Learning”
o Command line driven
o Java and Scala APIs
o ASF 2.0 Licensed
● Java implementation
o Parallelization
o GPU support
o Support for multiple GPUs per host
● Runtime neutral
o Local, Spark, Flink
o AWS
DL4J Suite of Tools
● DeepLearning4J
o Main library for deep learning
● DataVec
o Extract, Transform, Load (ETL) and Vectorization library
● ND4J
o Linear Algebra framework
o Swappable backends (JBLAS, GPUs)
o Think NumPy on the JVM
● Arbiter
o Model evaluation, hyperparameter search and testing platform
DL4J: DataVec for Data Ingest and Vectorization
● Uses an Input/Output format
● Supports all major types of Input data (Text, Images, Audio, Video, SVMLight)
● Extensible for Specialized Input Formats
● Interfaces with Apache Kafka
DL4J: ND4J
● Scientific computing library for the JVM (think NumPy on the JVM)
● Supports N-dimensional vector computations
● Supports GPUs via CUDA and native JBlas
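As a rough illustration of the NumPy-on-the-JVM idea, here is a minimal ND4J sketch (class and variable names are my own, not from the talk): element-wise addition followed by a matrix multiply on 2x2 matrices.

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class Nd4jDemo {

    // NumPy-style array math on the JVM: (a + ones) matrix-multiplied by a
    public static double sumOfProduct() {
        INDArray a = Nd4j.create(new double[][]{{1, 2}, {3, 4}});
        INDArray b = Nd4j.ones(2, 2);       // 2x2 matrix of ones
        INDArray c = a.add(b).mmul(a);      // element-wise add, then matrix product
        return c.sumNumber().doubleValue(); // sum of all entries: 11+16+19+28 = 74
    }

    public static void main(String[] args) {
        System.out.println(sumOfProduct());
    }
}
```

Swapping the backend (CPU JBLAS vs. CUDA) requires no change to this code, only a different ND4J backend dependency on the classpath.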
Learning Progressive Layers
Deep Learning Workflows
● Data ingestion and storage
● Data cleansing and transformation
● Split the dataset into Training, Validation and Test data sets
- Apache Flink DataSet API for Data Ingestion and Transformation
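The ingestion and transformation steps above could be sketched with the Flink DataSet API roughly as follows. This is a hedged sketch, not code from the talk: the file path is a placeholder, and the helper and class names are my own.

```java
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple5;
import org.apache.flink.api.java.utils.DataSetUtils;

public class IrisIngest {

    // cleansing predicate: keep only rows with all four measurements present
    static boolean isComplete(Tuple5<Double, Double, Double, Double, String> row) {
        return row.f0 != null && row.f1 != null && row.f2 != null && row.f3 != null;
    }

    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // ingest the raw Iris CSV (the path is a placeholder)
        DataSet<Tuple5<Double, Double, Double, Double, String>> iris =
                env.readCsvFile("file:///tmp/iris.csv")
                   .types(Double.class, Double.class, Double.class, Double.class, String.class);

        // data cleansing
        DataSet<Tuple5<Double, Double, Double, Double, String>> cleaned =
                iris.filter(IrisIngest::isComplete);

        // rough 80% sample as training data; a production split would also
        // carve out validation and test sets
        DataSet<Tuple5<Double, Double, Double, Double, String>> train =
                DataSetUtils.sample(cleaned, false, 0.8);

        train.print();
    }
}
```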
Data Ingestion and Munging
DL Model Building
● Build Deep Learning Network and Train with Training Data
● Parameter Averaging
● Test and Validate the Model
● Repeat until satisfied
● Persist and Deploy the Model in Production
Prediction and Scoring
Deployed model used to make predictions against streaming data
-- Streaming Predictors using the Apache Flink DataStream API
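A streaming predictor along these lines could look as follows. This is a sketch under stated assumptions, not the talk's implementation: the socket source stands in for a Kafka consumer, and `loadModel()` is a hypothetical helper that would restore the persisted configuration and parameters (as in the "Load Existing Models" slide).

```java
import java.util.Arrays;
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class StreamingPredictor {

    // parse one CSV line of features into a double[]
    static double[] parseFeatures(String line) {
        return Arrays.stream(line.split(",")).mapToDouble(Double::parseDouble).toArray();
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // feature vectors arriving over a socket; in practice this would be a Kafka source
        DataStream<double[]> features =
                env.socketTextStream("localhost", 9999).map(StreamingPredictor::parseFeatures);

        DataStream<Integer> predictions = features.map(new RichMapFunction<double[], Integer>() {
            private transient MultiLayerNetwork model;

            @Override
            public void open(Configuration cfg) {
                // hypothetical helper: restore the persisted config + parameters once per task
                model = loadModel();
            }

            @Override
            public Integer map(double[] v) {
                INDArray out = model.output(Nd4j.create(v));
                return Nd4j.argMax(out, 1).getInt(0); // index of the winning class
            }
        });

        predictions.print();
        env.execute("dl4j-streaming-predictions");
    }

    static MultiLayerNetwork loadModel() {
        throw new UnsupportedOperationException("wire in model loading here");
    }
}
```

Loading the model once in `open()` rather than per record avoids re-deserializing the parameters for every event.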
DL4J API Example

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
    .seed(12345)
    .iterations(1)
    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
    .learningRate(0.05)
    .l2(0.001)
    .list(4)
    .layer(0, new DenseLayer.Builder().nIn(784).nOut(250)
        .weightInit(WeightInit.XAVIER)
        .updater(Updater.ADAGRAD)
        .activation("relu").build())
    .layer(1, new DenseLayer.Builder().nIn(250).nOut(10)
        .weightInit(WeightInit.XAVIER)
        .updater(Updater.ADAGRAD)
        .activation("relu").build())
    .layer(2, new DenseLayer.Builder().nIn(10).nOut(250)
        .weightInit(WeightInit.XAVIER)
        .updater(Updater.ADAGRAD)
        .activation("relu").build())
    .layer(3, new OutputLayer.Builder().nIn(250).nOut(784)
        .weightInit(WeightInit.XAVIER)
        .updater(Updater.ADAGRAD)
        .activation("relu")
        .lossFunction(LossFunctions.LossFunction.MSE)
        .build())
    .pretrain(false).backprop(true)
    .build();
Building Deep Learning Workflows
● Flexibility to build / apply the model
o Local
o AWS, Spark, Flink (WIP)
● Convert data from a raw format into a baseline raw vector
o Model the data
o Evaluate the model
● Traditionally all of these are tied together in one tool
o But this is a monolithic pattern
The DL4J Suite of Tools let us do this
Load Existing Models in DL4J
String jsonModelConfig = loadTextFileFromDisk(pathToModelJSON);
MultiLayerConfiguration configFromJson = MultiLayerConfiguration.fromJson(jsonModelConfig);

FSDataInputStream hdfsInputStream_ModelParams = hdfs.open(new Path(hdfsPathToModelParams));
INDArray newParams;
try (DataInputStream dis = new DataInputStream(hdfsInputStream_ModelParams)) {
    newParams = Nd4j.read(dis);
}

MultiLayerNetwork network = new MultiLayerNetwork(configFromJson);
network.init();
network.setParameters(newParams);
Vectorizing Data - Iris Data Set
5.1,3.5,1.4,0.2,Iris-setosa
4.9,3.0,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
7.0,3.2,4.7,1.4,Iris-versicolor
vectorized to
0.0 1:0.1666666666666665 2:1.0 3:0.021276595744680823 4:0.0
0.0 1:0.08333333333333343 2:0.5833333333333334 3:0.021276595744680823 4:0.0
0.0 1:0.0 2:0.7500000000000002 3:0.0 4:0.0
1.0 1:0.9583333333333335 2:0.7500000000000002 3:0.723404255319149 4:0.5217391304347826
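Each `!NORMALIZE` column is min-max scaled: x' = (x - min) / (max - min). As a small worked check (the sepal-length min of 4.7 and max of 7.1 are assumptions inferred from the vectorized output; the full sample file evidently contains a 7.1 row not shown above):

```java
public class MinMaxCheck {

    // min-max normalization, as applied per !NORMALIZE column
    static double normalize(double x, double min, double max) {
        return (x - min) / (max - min);
    }

    public static void main(String[] args) {
        // sepal-length column stats inferred from the output above (assumption)
        double min = 4.7, max = 7.1;
        System.out.println(normalize(5.1, min, max)); // ~0.1667, matching row 1
        System.out.println(normalize(4.9, min, max)); // ~0.0833, matching row 2
        System.out.println(normalize(4.7, min, max)); // 0.0, matching row 3
    }
}
```

The class label in the first column is mapped to a numeric index (Iris-setosa = 0.0, Iris-versicolor = 1.0) rather than normalized.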
DataVec - Command Line Vectorization
● Library of tools to vectorize - Audio, Video, Image, Text, CSV, SVMLight
● Convert the input data into vectors in a standardized format (SVMLight, Text, CSV, etc.)
o Adaptable with custom input/output formats
● Open Source, ASF 2.0 Licensed
o https://github.com/deeplearning4j/DataVec
o Part of DL4J suite
Workflow Configuration (iris_conf.txt)
canova.input.header.skip=false
canova.input.statistics.debug.print=false
canova.input.format=org.canova.api.formats.input.impl.LineInputFormat
canova.input.directory=src/test/resources/csv/data/uci_iris_sample.txt
canova.input.vector.schema=src/test/resources/csv/schemas/uci/iris.txt
canova.output.directory=/tmp/iris_unit_test_sample.txt
canova.output.format=org.canova.api.formats.output.impl.SVMLightOutputFormat
Iris Canova Vector Schema
@RELATION UCIIrisDataset
@DELIMITER ,

@ATTRIBUTE sepallength NUMERIC !NORMALIZE
@ATTRIBUTE sepalwidth NUMERIC !NORMALIZE
@ATTRIBUTE petallength NUMERIC !NORMALIZE
@ATTRIBUTE petalwidth NUMERIC !NORMALIZE
@ATTRIBUTE class STRING !LABEL
Model Iris using Canova Command Line
./bin/canova vectorize -conf /tmp/iris_conf.txt
Output vectors written to: /tmp/iris_svmlight.txt

./bin/dl4j train -conf /tmp/iris_conf.txt
[ …log output… ]
./bin/arbiter evaluate -conf /tmp/iris_conf.txt
[ …log output… ]
DL4J + Apache Flink
• Apache Flink support for DL4J DataVec (in progress)
• Streaming predictors using Flink and Kafka (in progress)
Present DL4J–Flink Work in Progress
• Support for DL4J DataVec
• Streaming Predictions with Apache Flink
Future Work
• Flink support for DL4J Arbiter for hyperparameter search
• Flink support for building DeepLearning4J MultiLayer configurations
https://github.com/deeplearning4j
Credits
Skymind.io Team
• Adam Gibson
• Chris V. Nicholson
• Josh Patterson
Questions?