== Introduction ==

{{bloc-etroit|text=You will also find the English version of the [http://docs.spark.io/cli/#command-reference CLI command reference] on the Spark.IO site.}}

== Login & Wifi ==

=== spark setup wifi ===

Helpful shortcut for adding another wifi network to a core connected over USB. Make sure your core is connected via a USB cable and is slow-blinking blue (listening mode).

<nowiki># how to just update your wifi settings.
# Make sure your core is connected and in listening mode first
$ spark setup wifi</nowiki>

=== spark login ===

Login and save an access token for interacting with your account on the Spark Cloud.

<nowiki># how to login - creates and saves an access token for your session with the CLI
$ spark login</nowiki>


=== spark logout ===

Logout and optionally revoke the access token for your CLI session.

<nowiki># how to remove your saved access token, and optionally revoke that token as well
$ spark logout</nowiki>

== Managing Cores ==

=== spark list ===

Generates a list of the cores you own and displays information about their status, including which variables and functions are available.

<nowiki># how to show what cores of yours are online
# and what functions and variables are available
$ spark list

Checking with the cloud...
Retrieving cores... (this might take a few seconds)
my_core_name (0123456789ABCDEFGHI) 0 variables, and 4 functions
  Functions:
    int digitalWrite(string)
    int digitalRead(string)
    int analogWrite(string)
    int analogRead(string)</nowiki>

=== spark core add ===

Adds a new core to your account.

<nowiki># how to add a new core to your account
$ spark cloud claim 0123456789ABCDEFGHI
Claiming core 0123456789ABCDEFGHI
Successfully claimed core 0123456789ABCDEFGHI</nowiki>


=== spark core rename ===

Assigns a new name to a core you've claimed.

<nowiki># how to change the name of your core
$ spark core rename 0123456789ABCDEFGHI "pirate frosting"</nowiki>


=== spark core remove ===

Removes a core from your account so someone else can claim it.

<nowiki># how to remove a core from your account
$ spark core remove 0123456789ABCDEFGHI
Are you sure?  Please Type yes to continue: yes
releasing core 0123456789ABCDEFGHI
server said  { ok: true }
Okay!</nowiki>

== Flashing ==

Sends a firmware binary, a source file, a directory of source files, or a known app to your core.


=== Flashing a directory ===

You can set up a directory of source files and libraries for your project, and the CLI will use those when compiling remotely. You can also create a <code>spark.include</code> and/or a <code>spark.ignore</code> file in that directory that will tell the CLI specifically which files to use or ignore.

<nowiki># how to compile and flash a directory of source code to your core
$ spark flash 0123456789ABCDEFGHI my_project</nowiki>


=== Flashing one or more source files ===

<nowiki># how to compile and flash a list of source files to your core
$ spark flash 0123456789ABCDEFGHI app.ino library1.cpp library1.h</nowiki>

=== Flashing a known app ===

Two pre-built apps are included with the CLI to help you get back on track: Tinker and the CC3000 patching app. You can flash both of these through the CLI, either over the cloud or locally via USB and dfu-util.

<nowiki># how to flash a "known app" like tinker, or the cc3000 patcher to your core
$ spark flash 0123456789ABCDEFGHI tinker
$ spark flash 0123456789ABCDEFGHI cc3000

# how to flash if your core is blinking yellow and connected over usb
# requires dfu-util
$ spark flash --usb tinker
$ spark flash --usb cc3000</nowiki>

=== Compiling remotely and Flashing locally ===

To work locally but still use the cloud compiler, simply use the compile command, followed by the local flash command. Make sure you connect your core via USB and put it into [http://docs.spark.io/#/connect/appendix-dfu-mode-device-firmware-upgrade DFU mode] (''Spark, English'').

<nowiki># how to compile a directory of source code and tell the CLI where to save the results
$ spark compile my_project_folder --saveTo firmware.bin
OR
# how to compile a list of source files
$ spark compile app.ino library1.cpp library1.h --saveTo firmware.bin

# how to flash a pre-compiled binary over usb to your core
# make sure your core is flashing yellow and connected via USB
# requires dfu-util to be installed
$ spark flash --usb firmware.bin</nowiki>

== Compiling ==

Compiles one or more source files, or a directory of source files, and downloads a firmware binary.

=== Compiling a directory ===

You can set up a directory of source files and libraries for your project, and the CLI will use those when compiling remotely. You can also create a <code>spark.include</code> and/or a <code>spark.ignore</code> file in that directory that will tell the CLI specifically which files to use or ignore. Those files are just plain text with one filename per line.

<nowiki># how to compile a directory of source code
$ spark compile my_project_folder</nowiki>


=== Example spark.include ===

The spark.include and spark.ignore files are just regular text files with one filename per line. If your directory has one of these files, the CLI will use it to determine what to include or ignore when compiling your app.

<nowiki># spark.include
application.cpp
library1.h
library1.cpp</nowiki>


=== Example spark.ignore ===

<nowiki># spark.ignore
.ds_store
logo.png
old_version.cpp</nowiki>


=== Compiling one or more source files ===

<nowiki># how to compile a list of source files
$ spark compile app.ino library1.cpp library1.h</nowiki>

== Calls ==

=== spark call ===

Calls a function on one of your cores. Use <code>spark list</code> to see which cores are online and what functions are available.

<nowiki># how to call a function on your core
$ spark call 0123456789ABCDEFGHI digitalWrite "D7,HIGH"
1</nowiki>


=== spark get ===

Retrieves a variable value from one of your cores. Use <code>spark list</code> to see which cores are online and what variables are available.

<nowiki># how to get a variable value from a core
$ spark get 0123456789ABCDEFGHI temperature
72.1</nowiki>

=== spark monitor ===

Pulls the value of a variable at a set interval, and optionally displays a timestamp.

* The minimum delay is currently 500 (the CLI checks for, and corrects, anything lower)
* Pressing CTRL + C in the console exits the monitoring

<nowiki># how to poll for a variable value from one or more cores continuously
$ spark monitor 0123456789ABCDEFGHI temperature 5000
$ spark monitor 0123456789ABCDEFGHI temperature 5000 --time
$ spark monitor all temperature 5000
$ spark monitor all temperature 5000 --time
$ spark monitor all temperature 5000 --time > my_temperatures.csv</nowiki>

== Tools ==

=== spark identify ===

Retrieves your core id when the core is connected via USB and in listening mode (flashing blue).

<nowiki># helps get your core id via usb and serial
# make sure your core is connected and blinking blue
$ spark identify
$ spark identify 1
$ spark identify COM3
$ spark identify /dev/cu.usbmodem12345

$ spark identify
0123456789ABCDEFGHI</nowiki>

=== spark subscribe ===

Subscribes to published events on the cloud, and pipes them to the console. The special core name "mine" will subscribe to events from just your cores.

<nowiki># opens a connection to the API so you can stream events coming from your cores
$ spark subscribe
$ spark subscribe mine
$ spark subscribe eventName
$ spark subscribe eventName mine
$ spark subscribe eventName CoreName
$ spark subscribe eventName 0123456789ABCDEFGHI</nowiki>


=== spark serial list ===

Shows currently connected Spark Cores acting as serial devices over USB.

<nowiki># shows a list of cores connected via serial usb
$ spark serial list</nowiki>


=== spark serial monitor ===

Starts listening to the specified serial device, and echoes it to the terminal.

<nowiki># opens a read-only serial monitor for a particular core
$ spark serial monitor
$ spark serial monitor 1
$ spark serial monitor COM3
$ spark serial monitor /dev/cu.usbmodem12345</nowiki>

== spark keys (xxx) ==

The CLI includes several key-management utilities.

{{ambox|text=We have not translated the content of [http://docs.spark.io/cli/#running-from-source-advanced-spark-keys-doctor these commands, which you will find in the Spark documentation] (''Spark, English'') }}

The available commands are listed below; a usage sketch follows the list.
* spark keys doctor
* spark keys new
* spark keys load
* spark keys save
* spark keys send
* spark keys server
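
As a rough illustration only, the sketch below follows the form of these commands as shown in the English Spark documentation linked above; it has not been verified here. The core id (0123456789ABCDEFGHI) and the file names (core.der, core.pub.pem, my_server_key.der) are placeholders, and most of these commands expect the core to be connected over USB in DFU mode with dfu-util installed.

<nowiki># usage sketch (unverified - see the English documentation linked above)
# try to repair key problems on a core (placeholder core id)
$ spark keys doctor 0123456789ABCDEFGHI

# generate a new public/private keypair
$ spark keys new

# save the key currently on the core, or load one onto it (USB + dfu-util)
$ spark keys save core.der
$ spark keys load core.der

# send a core's public key to the cloud
$ spark keys send 0123456789ABCDEFGHI core.pub.pem

# tell the core which server public key to use
$ spark keys server my_server_key.der</nowiki>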
    
{{Spark.IO-CLI-TRAILER}}
 