@@ -9,14 +9,14 @@ msgid ""
 msgstr ""
 "Project-Id-Version: docker-stacks latest\n"
 "Report-Msgid-Bugs-To: \n"
-"POT-Creation-Date: 2019-05-11 14:42+0000\n"
+"POT-Creation-Date: 2019-06-02 00:19+0000\n"
 "PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
 "Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
 "Language-Team: LANGUAGE <[email protected]>\n"
 "MIME-Version: 1.0\n"
 "Content-Type: text/plain; charset=utf-8\n"
 "Content-Transfer-Encoding: 8bit\n"
-"Generated-By: Babel 2.6.0\n"
+"Generated-By: Babel 2.7.0\n"
 
 # 7c56c3891bd94336b21fc82d5aeab6ae
 #: ../../using/common.md:1
@@ -1362,8 +1362,30 @@ msgstr ""
 msgid "Apache Spark"
 msgstr ""
 
-# ec077f84f7394baba4071d0d8a9c9dbf
+# 433d8d99798649029dafd444253567b4
 #: ../../using/specifics.md:7
+msgid "Specific Docker Image Options"
+msgstr ""
+
+# 70083fc71521409895897387117748bc
+#: ../../using/specifics.md:8
+msgid ""
+"-p 4040:4040 - The jupyter/pyspark-notebook and jupyter/all-spark-"
+"notebook images open SparkUI (Spark Monitoring and Instrumentation UI) at"
+" default port 4040, this option map 4040 port inside docker container to "
+"4040 port on host machine. Note every new spark context that is created "
+"is put onto an incrementing port (ie. 4040, 4041, 4042, etc.), and it "
+"might be necessary to open multiple ports. For example: docker run -d -p "
+"8888:8888 -p 4040:4040 -p 4041:4041 jupyter/pyspark-notebook"
+msgstr ""
+
+# 2449b18de82e4d129cbd49e52ce9e522
+#: ../../using/specifics.md:10
+msgid "Usage Examples"
+msgstr ""
+
+# ec077f84f7394baba4071d0d8a9c9dbf
+#: ../../using/specifics.md:12
 msgid ""
 "The jupyter/pyspark-notebook and jupyter/all-spark-notebook images "
 "support the use of Apache Spark in Python, R, and Scala notebooks. The "
@@ -1372,37 +1394,37 @@ msgid ""
 msgstr ""
 
 # ca35b3b020914e2eb2e877199a90d4a4
-#: ../../using/specifics.md:9
+#: ../../using/specifics.md:14
 msgid "Using Spark Local Mode"
 msgstr ""
 
 # 15a0171869f3437481b9dfb2aec3db00
-#: ../../using/specifics.md:11
+#: ../../using/specifics.md:16
 msgid ""
 "Spark local mode is useful for experimentation on small data when you do "
 "not have a Spark cluster available."
 msgstr ""
 
 # 2c5367b84e444a1aa504910b22ba1454
 # 09d8b02687704d368a670cb243e299fb
-#: ../../using/specifics.md:13 ../../using/specifics.md:69
+#: ../../using/specifics.md:18 ../../using/specifics.md:74
 msgid "In a Python Notebook"
 msgstr ""
 
 # 2b5c49ea60184570a8b50622140c22e1
 # 192add33e94844f080ae03254899e2ee
-#: ../../using/specifics.md:22 ../../using/specifics.md:96
+#: ../../using/specifics.md:27 ../../using/specifics.md:101
 msgid "In a R Notebook"
 msgstr ""
 
 # 80cedeb3b4514de792dba8e03b1c8774
 # 5f3159414ddc427699709ddef740d6fd
-#: ../../using/specifics.md:34 ../../using/specifics.md:117
+#: ../../using/specifics.md:39 ../../using/specifics.md:122
 msgid "In a Spylon Kernel Scala Notebook"
 msgstr ""
 
 # 07c9256c669b488aaa2df48676d5a188
-#: ../../using/specifics.md:36
+#: ../../using/specifics.md:41
 #, python-format
 msgid ""
 "Spylon kernel instantiates a SparkContext for you in variable sc after "
@@ -1411,42 +1433,42 @@ msgstr ""
 
 # aeec453983524d3ab59f7241cf8bac7b
 # 472a48e72aaf46ca86a89e1598595045
-#: ../../using/specifics.md:50 ../../using/specifics.md:132
+#: ../../using/specifics.md:55 ../../using/specifics.md:137
 msgid "In an Apache Toree Scala Notebook"
 msgstr ""
 
 # 762d164260cd4938b5f9556b29b0e171
-#: ../../using/specifics.md:52
+#: ../../using/specifics.md:57
 msgid ""
 "Apache Toree instantiates a local SparkContext for you in variable sc "
 "when the kernel starts."
 msgstr ""
 
 # 8da1ef6876324b61885c5dec2c6a9cbf
-#: ../../using/specifics.md:59
+#: ../../using/specifics.md:64
 msgid "Connecting to a Spark Cluster on Mesos"
 msgstr ""
 
 # 4926e921fbd24baba9888b3f08cf4f51
-#: ../../using/specifics.md:61
+#: ../../using/specifics.md:66
 msgid "This configuration allows your compute cluster to scale with your data."
 msgstr ""
 
 # e8c29961728146a28c6581966a2d2341
-#: ../../using/specifics.md:63
+#: ../../using/specifics.md:68
 msgid "Deploy Spark on Mesos."
 msgstr ""
 
 # 1838d7f4481246538ca4ffe89e02ff4d
-#: ../../using/specifics.md:64
+#: ../../using/specifics.md:69
 msgid ""
 "Configure each slave with the --no-switch_user flag or create the "
 "$NB_USER account on every slave node."
 msgstr ""
 
 # d4ee49cc6cb547389ed3228e74a4a67c
 # 4b8c4c1e7ea441f1af4b4e0fbed73888
-#: ../../using/specifics.md:65 ../../using/specifics.md:161
+#: ../../using/specifics.md:70 ../../using/specifics.md:166
 msgid ""
 "Run the Docker container with --net=host in a location that is network "
 "addressable by all of your Spark workers. (This is a Spark networking "
@@ -1455,20 +1477,20 @@ msgstr ""
 
 # 9a026387155e46fa8e4e1ea3f00d3a63
 # 68e479d8f50e4685a0fb5de56a978347
-#: ../../using/specifics.md:66 ../../using/specifics.md:162
+#: ../../using/specifics.md:71 ../../using/specifics.md:167
 msgid ""
 "NOTE: When using --net=host, you must also use the flags --pid=host -e "
 "TINI_SUBREAPER=true. See https://github.com/jupyter/docker-"
 "stacks/issues/64 for details."
 msgstr ""
 
 # 16c4327879294075a64b4329f972321c
-#: ../../using/specifics.md:67
+#: ../../using/specifics.md:72
 msgid "Follow the language specific instructions below."
 msgstr ""
 
 # 929575857ae647aebbcb721af39bdd7e
-#: ../../using/specifics.md:134
+#: ../../using/specifics.md:139
 msgid ""
 "The Apache Toree kernel automatically creates a SparkContext when it "
 "starts based on configuration information from its command line arguments"
@@ -1478,72 +1500,72 @@ msgid ""
 msgstr ""
 
 # 3e3d5ec9fa554e75989856139938f4f8
-#: ../../using/specifics.md:136
+#: ../../using/specifics.md:141
 msgid ""
 "For instance, to pass information about a Mesos master, Spark binary "
 "location in HDFS, and an executor options, you could start the container "
 "like so:"
 msgstr ""
 
 # fa8494a4dde544109b9f6f49ac28178f
-#: ../../using/specifics.md:144
+#: ../../using/specifics.md:149
 msgid ""
 "Note that this is the same information expressed in a notebook in the "
 "Python case above. Once the kernel spec has your cluster information, you"
 " can test your cluster in an Apache Toree notebook like so:"
 msgstr ""
 
 # da5d5d861e914df98df9dba50fb3d66a
-#: ../../using/specifics.md:155
+#: ../../using/specifics.md:160
 msgid "Connecting to a Spark Cluster in Standalone Mode"
 msgstr ""
 
 # 79db0ba4244a4701aa8dfe0053d5579c
-#: ../../using/specifics.md:157
+#: ../../using/specifics.md:162
 msgid ""
 "Connection to Spark Cluster on Standalone Mode requires the following set"
 " of steps:"
 msgstr ""
 
 # 2c728588b6df4753a0c08f969364a79a
-#: ../../using/specifics.md:159
+#: ../../using/specifics.md:164
 msgid ""
 "Verify that the docker image (check the Dockerfile) and the Spark Cluster"
 " which is being deployed, run the same version of Spark."
 msgstr ""
 
 # d5a341bb44524a8cb33f086803daaf63
-#: ../../using/specifics.md:160
+#: ../../using/specifics.md:165
 msgid "Deploy Spark in Standalone Mode."
 msgstr ""
 
 # 3c781f06114240e28dcdb0c40a5d5cf5
-#: ../../using/specifics.md:163
+#: ../../using/specifics.md:168
 msgid ""
 "The language specific instructions are almost same as mentioned above for"
 " Mesos, only the master url would now be something like "
 "spark://10.10.10.10:7077"
 msgstr ""
 
 # 85baa5bd4ed5426b96dad49dacfab9cb
-#: ../../using/specifics.md:165
+#: ../../using/specifics.md:170
 msgid "Tensorflow"
 msgstr ""
 
 # 4249b4b266fc4aeeb85dc8386ab60592
-#: ../../using/specifics.md:167
+#: ../../using/specifics.md:172
 msgid ""
 "The jupyter/tensorflow-notebook image supports the use of Tensorflow in "
 "single machine or distributed mode."
 msgstr ""
 
 # 68fba23f7cd94702a9dead3c51719206
-#: ../../using/specifics.md:169
+#: ../../using/specifics.md:174
 msgid "Single Machine Mode"
 msgstr ""
 
 # d4b74babe01d4a3a86c46844a737151b
-#: ../../using/specifics.md:183
+#: ../../using/specifics.md:188
 msgid "Distributed Mode"
 msgstr ""