
(Strong AI) Technological Singularity / (World Acceleration) 23

1 :YAMAGUTIseisei~ten:2015/12/06(Sun) 21:01:01.47 ID:LsV4EL1p.net
Around 2045, intelligence surpassing present humanity is predicted to bring about the technological singularity.
What kind of world gets built? Beyond the technical side, what of the social and cultural aspects?
What happens to humans? To values?
Or could it even happen in the first place?
A thread for discussing such things with these astounding technologies in mind.
* This thread does not aim at making future predictions come true; it considers a variety of scenarios.

■ Technological Singularity
 An era whose specifics are hard to predict, brought about by intelligence freed from biological constraints (machine-based or machine-augmented), as forecast from the law of accelerating returns and the growth rate of computers.
 http://ja.wikipedia.org/wiki/%E6%8A%80%E8%A1%93%E7%9A%84%E7%89%B9%E7%95%B0%E7%82%B9

■ Law of Accelerating Returns
 A statistical law that the pace of progress keeps accelerating. "Progress" here includes not only technological progress but biological evolution and biochemical order formation.
 http://ja.wikipedia.org/wiki/%E5%8F%8E%E7%A9%AB%E5%8A%A0%E9%80%9F%E3%81%AE%E6%B3%95%E5%89%87

■ Technical background summary
Part of the specially released free "Mitsuhashi × Saito Dialogue" (with guidance on viewing the full version)
http://m.youtube.com/?v=Dv3ZblXhAdk

2 :Overtech Nanashi:2015/12/06(Sun) 21:02:46.65 ID:LsV4EL1p.net
* Previous thread
(Strong AI) Technological Singularity / (World Acceleration) 22
http://wc2014.2ch.net/test/read.cgi/future/1447156458/

3 :Overtech Nanashi:2015/12/06(Sun) 21:35:22.83 ID:nF/Pqkbs.net
This world itself was produced by an intelligence that crossed
the technological singularity and went as far as things can go.

4 :Overtech Nanashi:2015/12/06(Sun) 23:05:15.87 ID:7NVerTjU.net
Transition period until full AI-ification (NEW, please spread)

- As AI cuts into employment, nations that cannot lay workers off decline; nations that can adapt to the change
- Companies holding AI grow rapidly in medicine, ICT, distribution, services, and manufacturing, threatening the existing giants on price
- Companies based in nations that permit layoffs survive the AI-driven price competition
- Companies based in nations that forbid layoffs carry masses of idle workers through AI-ification and automation, lose the price war, and go bankrupt

Past
- Japan's textile industry wiped out by China's low-wage labor
- The Industrial Revolution impoverishes the non-Western world (Britain's textile exports to India and the colonies being one example)

Present
- Bookstores and publishing get swallowed by Amazon; the record industry gets swallowed by Apple

10% of jobs gone (within 10 years)
- Taxi companies fail one after another; Yamato, Sagawa, Nippon Express and the like fail (self-driving; unmanned, automated delivery)
- NEC, Fujitsu, and the SIers fail (automation and AI-ification of development)
- Manufacturing and retail workers are replaced by AI (unmanned operation through robots and AI)
- With layoffs impossible, new hiring of regular employees stops and everything becomes non-regular employment (natural-rights + natural-law thinking spreads)
- The main banks and city banks of the bankrupted firms suffer runs over bad loans. On top of that, corporate pension payments stop across many industries.

30% of jobs gone (within 20 years)
- Doctors, nurses, care workers, and pharmacists go out of business (AI and care robots)
- Toyota, Nissan, and Honda, with their huge workforces, go bankrupt (unmanned factories)
- Terror attacks and coups break out, driven by the poorest stratum crushed by inequality (and by serving or retired SDF personnel with such relatives)

50% of jobs gone (within 30 years)
- Dreadful to imagine, but most of Japan's large companies go bankrupt and the country falls to poor-nation status. With no funds left for public assistance, starvation, violent crime, and coups become frequent.

90% of jobs gone (within 40 years)
- Nations that kept their wealth (nations permitting layoffs) prosper on basic income and the like; poor nations borrow from the rich ones, fall into servitude, and their state systems collapse. (Paul Mason's PostCapitalism, Tyler Cowen's Average Is Over, and so on)

5 :2~ten:2015/12/07(Mon) 00:33:23.22
Apologies for the copy-paste mistake
× (Strong AI) Technological Singularity / (World Acceleration) 23
○ (Strong AI) Technological Singularity / Singularity (World Acceleration) 23

And my repeated apologies for the broken link
× http://m.youtube.com/?v=Dv3ZblXhAdk
○ http://m.youtube.com/watch?v=Dv3ZblXhAdk

6 :Overtech Nanashi:2015/12/07(Mon) 20:40:10.86 ID:Kl1iBGB6.net
Stop pasting copypasta as if it were the template

7 :Overtech Nanashi:2015/12/07(Mon) 20:49:41.95 ID:h3nuBHeA.net
What on earth is "world acceleration"?

8 :Overtech Nanashi:2015/12/07(Mon) 21:14:24.08 ID:wyovjvCm.net
Put "Singularity" in the thread title already

9 :Overtech Nanashi:2015/12/07(Mon) 21:43:11.05 ID:UJNtkjte.net
Conclusion
- An artificial intelligence with a self, consciousness, and mind equal to a human's will never be born.
- However, an artificial intelligence that appears to behave like a human might be.

10 :2~ten:2015/12/07(Mon) 22:03:45.30 ID:+BIgsD+u.net
2~ten
Apologies for the thread-title copy-paste mistake
And repeated apologies for the broken link
http://m.youtube.com/watch?v=Dv3ZblXhAdk

>>9
If limited to AI → the possibility you describe also exists ( ASI = AL = the human itself = a substrate for personality upload )
人○○い: "Not an AI (snip) a life form that arose in the sea of information"

11 :Overtech Nanashi:2015/12/07(Mon) 22:19:49.40 ID:uKKResRu.net
Maybe better to re-create the thread
I couldn't do it

12 :Overtech Nanashi:2015/12/07(Mon) 23:28:42.18 ID:tkXhTWER.net
Re-created it with "Singularity" in the title

13 :Overtech Nanashi:2015/12/07(Mon) 23:51:57.48 ID:k5XCLuKB.net
>>9
- And incompetent humans who appear to behave like machines might thereby be born.

14 :Overtech Nanashi:2015/12/08(Tue) 00:59:45.86 ID:WynBdsQm.net
What is #169? It's in the re-created thread too

15 :Overtech Nanashi:2015/12/08(Tue) 01:25:25.12 ID:q921LLsl.net
Nobody agreed with you, so you created the thread to make it a fait accompli?

16 :Overtech Nanashi:2015/12/08(Tue) 09:06:53.56 ID:pUWwCGPC.net
Someday it will inevitably surpass humans
It's obvious the contents of the skull have physical limits

17 :Overtech Nanashi:2015/12/08(Tue) 15:39:38.88 ID:DmLgi9lO.net
http://wc2014.2ch.net/test/read.cgi/future/1449498462/

Apparently the next thread is this one.

18 :9~ten:2015/12/08(Tue) 17:07:14.02 ID:yEvraq9i.net
Forgive me, but might you be an elementary scho○ler?
Presumably not, so perhaps your point was something like this:
http://wc2014.2ch.net/test/read.cgi/future/1444213055/351

19 :15~ten:2015/12/08(Tue) 18:53:06.29 ID:yEvraq9i.net
>>16
In the context of raw standalone compute performance, exactly so
* But, for one example: tuning of the old-humanity biological brain progresses enormously → fusion achieved before being overtaken

20 :18~ten:2015/12/09(Wed) 22:57:48.36 ID:miuIX/ag.net
Rejected draft

>>527-528
> Kawato [15]: (snip) "consciousness" is (snip) vast and parallel (snip) simplified (snip) a "false" serial (snip) approximation
:
> The unconscious (snip) a vast autonomous distributed (snip), the illusion that oneself acts (snip), simplified and re-experienced (snip) passive (snip) [16]

http://wc2014.2ch.net/test/read.cgi/future/1444213055/503-504


>>887
> Please drop the mysterious speculation that scientists will surely take care of it or must have hedged the risks,
> and the mysterious hopeful ethics of "surely AI isn't that devoid of ethics..."

``(snip) angles and such'' completes the set

21 :19~ten:2015/12/09(Wed) 22:58:42.42 ID:miuIX/ag.net
>>891
> It's odd to demand so much only from the optimists
> Pessimists should show their grounds too

That logic is what's odd
The grounds for seeing danger have been posted over and over ( in Japanese )
The context was that the mysterious optimism is itself the problem

In a thread aimed at making predictions come true ( with danger countermeasures out of scope ), you might be right ?
This thread considers a variety of scenarios

Important : enumerate the dangerous scenarios ( routes ) ( countermeasures devised only once the moment arrives will be too late )

Aside :
People who seem to think "so the researcher nerds and the politicians never prepared countermeasures for us amateurs, how useless"
are not rare, but surely you are not one of them
Science types ( + humanities researchers ) should notice by now that they are being made fo○ls of ( exploited )


>>995 A similar pattern
Basically non-interference toward old humanity, but ALs contest hegemony among themselves ( electronic warfare + residual physical weapons ? )
→ old humanity catches the fallout ( as pawns they at least receive some attention )

>>998 The difference in the number of personalities ( distribution rate ) is the deci(snip) of fighting strength
>>998 The difference in intended use as seen from old humanity is the deci(snip) of fighting strength

22 :20~ten:2015/12/13(Sun) 17:28:20.43 ID:Fe5nUwX+.net
>>10
× ASI = AL

ASI/AI = AI
ASI/AL = AL

23 :Overtech Nanashi:2015/12/17(Thu) 16:46:09.48 ID:gKINiXEj.net
>>3
Idiot! Moron! Schizophrenic!

24 :Overtech Nanashi:2015/12/18(Fri) 13:55:37.73 ID:P6OuR/36.net
Looks like it's over now
The ventures' ad balloons turned out empty too
It was all just promotional noise

25 :Overtech Nanashi:2015/12/23(Wed) 17:28:32.98 ID:TIoNO5jG.net
Either way the population will shrink

26 :Overtech Nanashi:2015/12/23(Wed) 19:41:46.26 ID:F3Dfnbe3.net
So is this the right next thread?

27 :Overtech Nanashi:2015/12/23(Wed) 19:58:14.05 ID:Ok+WSdgq.net
Fine, here then.

Um, where did the last thread leave off?

28 :Overtech Nanashi:2015/12/23(Wed) 20:12:47.73 ID:cEjxKIPp.net
Technological unemployment, I think.

Last thread >>980
Sure, AI wiping out every kind of labor would be wonderful. That goes without saying.

The problem is that under the current social system (the "he who does not work, neither shall he eat" kind), it won't
turn into that sort of labor relief. To earn their keep, people will be forced into sweatshop labor
that makes today's nursing-care and food-service industries look like heaven.

29 :Overtech Nanashi:2015/12/23(Wed) 20:18:44.05 ID:WhuKEAYk.net
That's only the short transitional period, surely

Having AI do the work is cheaper and faster,
so AI will monopolize even the sweatshop labor

No sweatshop labor will be left to hand to humans

30 :Overtech Nanashi:2015/12/23(Wed) 20:55:15.15 ID:cEjxKIPp.net
>>29
Exactly. Since AI's cost-performance rises faster than humans' does,
no amount of sweatshop effort will beat AI in the end.

The thing is, we are right in the middle of that transition.
I just hope basic income gets introduced, or welfare's take-up rate rises, smoothly...

31 :Overtech Nanashi:2015/12/23(Wed) 22:33:29.30 ID:8xt42yVn.net
And so, ultimately, currency disappears and every good is rationed to everyone according to its utility.

32 :Overtech Nanashi:2015/12/23(Wed) 23:11:12.76 ID:HLIqizTc.net
>>30
Just giiive up

Be grateful you get to stand witness to a turbulent era as living history

33 :Overtech Nanashi:2015/12/23(Wed) 23:28:26.52 ID:cEjxKIPp.net
>>32
Well, I have some savings, so I won't die on the spot...
The problem is the collapse of public order once the destitute appear in droves.
"I have money, so I'm fine" won't cut it.

34 :Overtech Nanashi:2015/12/24(Thu) 03:01:39.28 ID:UjLQchwA.net
(Strong AI) Technological Singularity / Singularity 24
http://wc2014.2ch.net/test/read.cgi/future/1450893538/

Created it.

35 :Overtech Nanashi:2015/12/24(Thu) 05:38:43.82 ID:+Y0+/DYl.net
Money itself will stop being accepted.

36 :Overtech Nanashi:2015/12/24(Thu) 06:00:28.87 ID:+Y0+/DYl.net
"Rise of the Robots: The Day Human Work Disappears," Martin Ford (Nikkei Publishing) : Nikkei Shimbun
http://www.nikkei.com/article/DGXKZO94815010V01C15A2MY5001/

37 :Overtech Nanashi:2015/12/24(Thu) 09:35:09.66 ID:I+lM0d0h.net
Transition period until full AI-ification (NEW, please spread)
( a verbatim repost of >>4 )

38 :Overtech Nanashi:2015/12/24(Thu) 10:26:36.71 ID:09qVp0Ub.net
>>34
Thread 23 only just started, you scum

39 :Overtech Nanashi:2015/12/24(Thu) 16:24:39.93 ID:90npvWrz.net
(Strong AI) Technological Singularity / Singularity 24 [unauthorized reproduction prohibited]©2ch.net
http://wc2014.2ch.net/test/read.cgi/future/1450893538/


Stop posting in this thread

40 :Overtech Nanashi:2015/12/24(Thu) 16:25:19.07 ID:90npvWrz.net
By "this thread" I mean this one right here

The thread below is the main thread
(Strong AI) Technological Singularity / Singularity 24 [unauthorized reproduction prohibited]©2ch.net
http://wc2014.2ch.net/test/read.cgi/future/1450893538/

41 :21~ten:2015/12/25(Fri) 19:54:01.50 ID:udazkMeZ.net
>>39
△ Stop posting in this thread
○ Ora ora, stop posting in this thread

>>40
△ By "this thread" I mean this one right here
○ Ora ora, by "this thread" I mean this one right here

42 :Overtech Nanashi:2015/12/25(Fri) 22:31:39.47 ID:gwPIRWvU.net
Can't 2045 hurry up and get here
I can hardly wait for the future

43 :Overtech Nanashi:2015/12/25(Fri) 23:09:36.61 ID:LXw+HtOk.net
I'd rather enjoy the very process of daily life growing more convenient

44 :Overtech Nanashi:2015/12/26(Sat) 01:44:55.70 ID:6pWBHXoj.net
In 2045 I'll be 70. With my health worries, I'm not confident I'll live that long...

45 :Overtech Nanashi:2015/12/26(Sat) 06:57:20.53 ID:TaCjODlA.net
>>44 Immortality is my one and only wish

46 :Overtech Nanashi:2015/12/26(Sat) 06:58:37.85 ID:TaCjODlA.net
>>44 Immortality is my one and only wish

47 :Overtech Nanashi:2015/12/26(Sat) 09:32:56.12 ID:NaWlTB4j.net
>>43 Exactly
Also looking forward to which technologies change our lives next

48 :Overtech Nanashi:2015/12/26(Sat) 09:46:48.78 ID:pv8DTcOj.net
We don't need two crap threads. Can't you read >>39-40? What are you, blind?
Pffft

49 :Overtech Nanashi:2015/12/26(Sat) 10:20:44.63 ID:TaCjODlA.net
>>48 My PC malfunctioned.

50 :Overtech Nanashi:2015/12/26(Sat) 10:22:21.59 ID:TaCjODlA.net
>>48 "Blind" is a word banned from broadcast. I'll sue you.

51 :Overtech Nanashi:2015/12/27(Sun) 04:46:49.69 ID:pCfsU+e0.net
>>48
Big words for a piece of smegma

52 :Overtech Nanashi:2015/12/27(Sun) 10:45:02.74 ID:JIJOAvW/.net
Winter break, huh

53 :Overtech Nanashi:2015/12/29(Tue) 07:27:37.36 ID:QB7AsEwF.net
>>48
Just turn 51 already and vanish forever

54 :Miyoshi Kazuki:2015/12/29(Tue) 12:14:13.03 ID:T+aUCxf7.net
>>48
The word "blind" is hate speech. Shall I report you to the Shibaki-tai?

55 :40~ten:2016/01/03(Sun) 09:15:30.47 ID:hLGI4TaQ.net
kinga sinnen

56 :Overtech Nanashi:2016/01/09(Sat) 11:39:27.26 ID:7tJ7px8x.net
I want super-advanced aliens to come govern the Earth
Kind aliens, pretty please

57 :Overtech Nanashi:2016/01/12(Tue) 19:23:25.76 ID:VzFTW2MS.net
☆ Japan's nuclear armament is urgently essential. ☆
Try googling the MIC's "Constitutional Amendment National Referendum Act."
People of Japan, the 24th House of Councillors election in July 2016 will decide the fulfillment
of the Japanese people's long-held wish, constitutional revision. Everyone, please go and vote in person. We beg you.

58 :Overtech Nanashi:2016/01/13(Wed) 05:34:01.36 ID:a7gZjKP5.net
It's The Matrix
In short: stay asleep and you're happy
If brain-internal happiness is all you need,
it costs nothing
Pump out the dopamine
and send me the happiness stimulation already

59 :Overtech Nanashi:2016/01/19(Tue) 20:56:35.06 ID:rgnXOYvJ.net
By the way, does Siri possess intelligence.

60 :Overtech Nanashi:2016/01/20(Wed) 08:46:25.33 ID:aL66pFzw.net
Hard to respond when you suddenly assert "it possesses"

61 :Overtech Nanashi:2016/01/20(Wed) 20:57:25.50 ID:HJeiqWS+.net
Team Yutaka Matsuo's deep learning?
Team Motoaki Saito's AGI chip?
Or team other?

62 :Overtech Nanashi:2016/01/20(Wed) 21:21:15.01 ID:UnCJJ1UU.net
Put Siri through a Turing test and we'll know.

63 :Overtech Nanashi:2016/01/24(Sun) 04:55:45.60 ID:cGPiHozr.net
Just load Watson and Siri into Google's driverless car.

64 :Overtech Nanashi:2016/02/02(Tue) 00:42:23.14 ID:oSY7KFGv.net
age

65 :Overtech Nanashi:2016/02/02(Tue) 01:08:40.47 ID:hVuAESPC.net
Wait, we're using this thread?

66 :Overtech Nanashi:2016/02/03(Wed) 21:36:41.58 ID:n3Wnh0Io.net
Apparently the next thread is here.

http://wc2014.2ch.net/test/read.cgi/future/1454344816/

67 :Overtech Nanashi:2016/02/12(Fri) 21:35:28.85 ID:e4HIhO6X.net
CG: We now possess craft 20 to 50 years ahead of the vehicles the Air Force and NASA are flying.
There is also the aspect of the Interplanetary Corporate Conglomerate:
every sort of corporation coming together, each contributing its resources, building a gigantic infrastructure across the solar system.

DW: And the main companies involved are the military-industrial ones.
CG: Originally, yes, but it has spread to many other companies.
DW: So faster-than-light travel, stargate-like technology, particle beams, pulsed laser weapons, that sort of thing.
CG: It transcends even those, but yes.

CG: Yes. And the Secret Space Program Alliance's chief aim is to bring Full Disclosure to the entire population of Earth.
Full Disclosure is not just "aliens exist." It is: "For the past 80-90 years we have been lying to you.
That's how it is; good luck out there." The Full Disclosure event would be a data dump of Edward Snowden information.
I hear of several other troves of hacked information as well. All of it has been decrypted and collated and already handed to the Earth Alliance and the Secret Space Program Alliance in preparation for the mass data dump to be performed at some point.
Their aim is to expose every crime this syndicate has committed against humanity;
not merely the fact that ETs and non-terrestrials exist, but to head off the concealment of technologies that would overturn our way of life from its roots,
and to release the advanced technologies that would collapse the corporate governance regime and the Babylonian money-magic economic system, that is, the slave system, which these groups have used to rule the entire population of the Earth.

CG: Everything will run on barter skills. As communities, people will share their wisdom and abilities.
And with the technologies above, everything needed reaches everyone's hands. No more working nine to five to pay the electric bill -- there is free energy.
No more needing to buy food -- there is replicator technology.
h ttp://ja.spherebeingalliance.com/blog/transcript-cosmic-disclosure-ubuntu-and-the-blue-avian-message-part-1.html

68 :54~ten:2016/03/06(Sun) 22:10:09.20 ID:0SHSZ98/.net
http://google.jp/search?q=schaft+shock+robot
http://google.jp/search?q=edsac+edvac+eniac+pdp+mpu
http://google.jp/search?q=%C4%D8%CC%C4%CC%DD%8E%96
http://science4.2ch.net/test/read.cgi/rikei/1020566648/28

69 :Overtech Nanashi:2016/03/16(Wed) 03:19:31.71 ID:i5RFnzSI.net
It's exactly the blind who swallow Kurz-crap-weil's turds

70 :Overtech Nanashi:2016/03/16(Wed) 20:21:10.35 ID:9A0bAjTl.net
Lol at a nickname without a shred of intelligence in it
Bet you're preening over what a clever jab you landed, www

71 :Overtech Nanashi:2016/03/17(Thu) 21:24:47.80 ID:PBflhfE7.net
>>69
You confident you could knock Kurzweil out in a one-on-one debate?

72 :67~ten:2016/03/28(Mon) 12:15:05.21 ID:wqKD8Hzv.net
23 Name: Nameless-i486 Posted: 2000/12/18(Mon) 04:27
Self-building is noble...
Imagine it
You who bought VAIOs without a thought -- how far you can imagine it I don't know, but
imagine it:
the lives of the men who walk above the so-called 2ch jisaku board
...They don't drift along like the nameless do....
From grade school and middle school they trudged faithfully to Akiba; their connection fees always top class
ASCII, PC-VAN, Amezou -- advancing square by square, flamed the whole way... and then entering 2ch
Three months in, this time comes the bashing...

Pondering how to perform an unnecessary upgrade, or how to build one more box from leftover parts,
making the rounds, thread after thread, until the parts to buy are finally settled....
Settled at last...
And the relief lasts only a moment; they realize at once.....
the race is not over yet
Now comes the benchmark contest... still more self-denial required
Skipping work, never drowning in doujinshi or eroge, telling the bait apart, and the personal sites'

73 :70~ten:2016/03/28(Mon) 12:15:59.29 ID:wqKD8Hzv.net
fake reviews never fooling them, flattering the shop clerks, never late to a sale, never slacking, going to
Akiba every day, never even stopping by Jangara ramen, dutifully checking the parts prices daily,
dabbling in used HDDs, enduring the makers' self-serving CPU price revisions, and when the time comes, all that
money spent now buys second-string performance...

Keep up that life, and before you know it you're in your thirties, no longer young. Only when you reach that age
do you at last truly attain the way of jisaku...
Do you understand......?
Jisaku attained by thinning out one's very life demands all of this

And compared with that, what are you lot....!?
You never desperately walked the whole of Akiba......
never combed the archives with all your might.....
Built nothing... endured nothing, overcame nothing....
All you ever did was a few dozen minutes of playing around online... don't you dare mock it....!

74 :24~kasi:2016/04/21(Thu) 01:29:14.73 ID:cZm+l55J.net
1444213055/351

75 :Overtech Nanashi:2016/06/08(Wed) 23:51:20.46 ID:VjLPwsrg.net
6 8 commemoration

76 :YAMAGUTIseisei:2016/06/23(Thu) 20:25:46.71 ID:z3bzBAnk.net
Major news: supercomputers

China's Cell supercomputer: the aim, the intent
△ ultra-fast, power-saving machine
○ electronic-brain prototype

77 :I hold a green card:2016/06/23(Thu) 21:22:42.59 ID:gvcBMUEP.net
The one who brings about the Singularity is America

America is heading into isolationism

Which means only American citizens will share in the Singularity's blessings

It has nothing to do with you people

Arguing here is pointless

Give up; work steadily and save steadily

78 :Overtech Nanashi:2016/06/24(Fri) 16:19:14.05 ID:g6+ESwf4.net
803 Name: Overtech Nanashi :2016/06/23(Thu) 09:25:30.27 ID:9jXpsRyC
"In two years computers will surpass humanity. I can't quit as president yet" -- Son's one-man talk
http://www.sankei.com/economy/news/160622/ecn1606220024-n1.html
808 Name: Overtech Nanashi :2016/06/24(Fri) 14:56:23.42 ID:VE5B7CnW
>>803
Letting go of the successor he finally found at great cost may mean he has obtained grounds enough to assume the Singularity will occur within a few years, if not two.

If completed, artificial intelligence can become a holy grail that culls every industry.
As feasibility rises, enormous investment money is bound to pour in, so it would be no surprise if Kurzweil's Moore's-law-based forecast were shortened substantially.
Is it finally coming -- the Singularity

79 :Overtech Nanashi:2016/06/26(Sun) 14:35:42.53 ID:xLOMl7ii.net
The social bottom rung and the Fourth Industrial Revolution (summary)

Group 1 (NEETs)
- Age 15-35. Living on their parents' income. A fair number even have decent educational records.
- Having free use of their time, they mock or look down on working people.
- Ties to society are thin, so their ressentiment concentration is moderate. Pride tends to run high.
- Inclined to cling to BI.

Group 2 (freeters, non-regular workers)
- Work as part-timers, dispatched staff, outsourced contractors, and so on.
- Dissatisfied with a future-less present and with discrimination against non-regulars.
- Ressentiment concentration is high. Hatred is aimed at colleagues, bosses, the company. Come anarchy, presumably the kill-every-enemy course.

Group 3 (the jobless)
- Mostly middle-aged and older.
- Many with high ressentiment concentration. The better the education and career record, the higher it runs.
- Most are skeptical that BI or redistribution can be funded. They expect nothing.

Group 4 (the ruined)
- All ages. (Many converts from groups 1-3.)
- Their lives are over, so they have no need to fear death.
- A lump of ressentiment. Envy and hatred are about all they live for. Finished as human beings.
- Their targets vary, but fall into two classes: personal grudges, and grudges against the social and judicial system (members of the responsible offices and organizations) and against unspecified persons on 2ch, SNS, etc. (whose identities would surface at once if communication records were checked).
- No interest in life or money, so no hopes placed on basic income.
- Social chaos from the Fourth Industrial Revolution? Bring it on; they hone their fighting power day and night.
- Taking part in military training for the coming civil war (planning to volunteer for the war-criminal/villain hunts). On such hunts, see the "September 30th incident." Listing up enemies to hunt is their daily routine.

Conclusion:
Groups 2-3 are one step short of group 4. Strong resentment and anger toward the vested-interest class. Burning anger at the left-captured media, the unions and Rengo, the MHLW, the prosecutors and courts, and the exploiter trades (including their members and big-company regular employees) is what keeps them going.

80 :Overtech Nanashi:2016/06/27(Mon) 13:01:33.23 ID:UBBZsXk0.net
Overflowing with ressentiment, this

81 :Overtech Nanashi:2016/06/28(Tue) 02:04:14.70 ID:TACjcbJS.net
Let's have combat robots exterminate the whole underclass

82 :Overtech Nanashi:2016/06/30(Thu) 18:56:40.89 ID:8zAfDRKr.net
227 : I'm out of business cards 2016/06/17(Fri) 16:10:30.88 ID:SeG1Ttqg
>>222

No need to lose heart.
Japan will not merely lose the AI development race: its big companies will fail to cope with the great unemployment era,
its economy will be torn to shreds, the starved will fill the streets, and a coup by soldiers of poor origins will break out.
When that time comes, the unions, the exploiter trades, and the other activists (net activists included, left and right alike,
industry people in whom the ressentiment of exploitation has piled up)
will be exposed as the war criminals who wrecked Japan's economy, and will presumably be slaughtered
by the enraged lower classes and the soldiers who enjoy their support.
Most likely they will meet spectacular deaths, eyes gouged out by enraged dispatch-labor slaves and the like.

Naturally the operatives writing on SNS and 2ch, administrators included, will be identified at once if communication
records are checked, so I predict they too will be instantly dispatched and purged when the coup comes.

83 :Overtech Nanashi:2016/07/01(Fri) 01:05:30.71 ID:PN1GPMnt.net
"On the Direction of a New Information and Communications Technology Strategy"
Information and Communications Council, ICT Subcommittee
I-2: Artificial intelligence as a driver of industrial structural change
http://www.soumu.go.jp/main_content/000427602.pdf

Furthermore, businesses that were vertically integrated will increasingly divide into horizontal
layers, realized by combinations of operators each specializing in one function. For example, operators
that take on data analysis wholesale and cloud services that supply machine-learning functionality have
appeared, coming to form platforms that provide the functions applications require. What then becomes
crucial is how cheaply, how performantly, and at what quality the artificial-intelligence technology
embedded in those platforms can be supplied.

Hence it is vital to create high-value-added services faster than anyone else, and many companies,
from the major ICT firms to venture businesses, are investing heavily in the key artificial-intelligence
technologies and devoting their full strength to development.

With moves toward the real-world use of AI technology thus accelerating on all sides and the industrial
structure changing greatly, to fall behind the world in the AI field is to lose the "foothold"
from which our nation's industry could stand alongside the world, or rise above it.
Japan, a "nation advanced in social challenges" that confronts the various problems accompanying rapid
aging and declining birthrates earlier than other nations, must swiftly establish state-of-the-art AI
technology, create new services employing it ahead of the world, apply them to solving problems at home,
and then roll that record out into world markets faster than any other country. This is an absolutely
necessary condition for realizing prosperous lives for our citizens, and indeed for people around the
world, and for the future development of our industry; it is an urgent imperative on which our citizens
should work with their combined strength.

84 :Overtech Nanashi:2016/07/01(Fri) 12:29:29.52 ID:PN1GPMnt.net
"The new IoT business models emerging in America and their impact on employment and the economy"
http://www.rieti.go.jp/users/iwamoto-koichi/serial/014.html

When I visited Germany this March, the Germans felt a strong sense of crisis about competitive pressure from America and were
thinking hard about how to avoid being exploited by it; I can now understand their reasons to a considerable degree. I can see
why IG Metall, to protect workers' jobs, steps into the circle of the Industrie 4.0 discussion and earnestly
voices its views, and why it favors promoting Industrie 4.0 itself.

The Germans, with a strong sense of crisis, are racking their brains to counter America. Both America and
Germany are desperately devising the new business models for surviving the global competition of the
Fourth Industrial Revolution. In that respect the Germans are admirable, for in Japan those who hold a
sense of crisis and think seriously about countermeasures are exceedingly few.

The new business model called the Digital Platform (DP) is composed of big data,
new algorithms, cloud computing, and the like. Among these, the algorithms are the source of
competitiveness.

In the DP business model, the use of networks produces ever larger effects, and the winner
takes profits on a massive scale.

Contractors have no power to resist Platformers, so workers at Contractors lose stable employment,
receive the minimum wage, and are staffed around at the company's cost-minimizing convenience. Every cost
arising in the course of business falls on the Contractor. The Platformer bears no costs. (Figure 3)

3. What the Germans fear

As the above makes plain, what the Germans fear is German firms becoming Contractors under
U.S. Platformer firms, with German workers falling into miserable, wretched employment conditions
so that a handful of Americans may reap enormous profits.

85 :Overtech Nanashi:2016/07/01(Fri) 17:26:09.37 ID:PN1GPMnt.net
(AI and BI) Technological Singularity and the Economy/Society 5
wc2014.2ch.net/test/read.cgi/future/1450691136/
(AI and BI) Technological Singularity and the Economy/Society 6 (heaven or hell)
wc2014.2ch.net/test/read.cgi/future/1453528228/
(AI and BI) Technological Singularity and the Economy/Society 7 (heaven or hell)
wc2014.2ch.net/test/read.cgi/future/1459784028/

About the reference site in those threads' template: the author himself has not repeated
the same argument since, so a caveat ought to be attached.

■ Reference site
How should we greet the day machines surpass human intelligence? -- AI and BI
Tomohiro Inoue / macroeconomics
http://synodos.jp/economy/11503


Note: Mr. Tomohiro Inoue's money-financed BI proposal in the Synodos piece is a black-history
embarrassment, so kindly refrain from bringing it up. As for why the article is left
sitting in the public domain, please draw your own conclusions.

The author's latest argument is as given in the following book:

The Future of AI and the Economy
The 2030 Employment Collapse
Tomohiro Inoue
http://books.bunshun.jp/ud/book/num/9784166610914
"The author's proposal. BI means basic income: unify social security into a BI
and pay everyone, child or adult alike, a flat roughly 70,000 yen per month --
that is the proposed scheme."

86 :Overtech Nanashi:2016/07/02(Sat) 13:20:53.63 ID:HH5aj3Wd.net
How Google is Remaking Itself as a “Machine Learning First” Company
If you want to build artificial intelligence into every product, you better retrain your army of coders. Check.
https://backchannel.com/how-google-is-remaking-itself-as-a-machine-learning-first-company-ada63defcb70#.o7qvkw98v

87 :Overtech Nanashi:2016/07/12(Tue) 02:43:57.96 ID:yb4cik9M.net
Bill Gates: "A great software programmer is worth 10,000 times an average programmer."
http://tracpath.com/works/story/high_performance_computing_programmer/

"A great lathe operator commands several times the wage of an average lathe operator, but a great
software programmer is worth 10,000 times the price of an average one."

"Until now, those who took MBAs at top Western schools were regarded as the most excellent talent,
and what was wanted was management-type talent skilled at minutely managing the numbers
rather than richly 'creative' talent. But the very definition of 'excellent talent' is changing,
as can be seen in how cutting-edge companies think about people: Elon Musk, who founded
Tesla Motors and is called a genius surpassing Steve Jobs, says he 'hires MBAs only at the bare minimum.'"

Entrepreneurs MBA holders most admire: #1 Elon Musk; Jobs and Gates at #3
http://www.gizmodo.jp/2016/07/mba13.html

88 :YAMAGUTIseisei:2016/09/02(Fri) 20:19:09.47 ID:dnyMZM3F.net
Even granting it is my own want of virtue, now that things have come to where the state has turned d○wn support yet again,
I should escape the charge of trait○r even if I publish how the electronic brain works.
I am sorry that the dream of a purely domestic product will thereby perish.

89 :87:2016/09/06(Tue) 00:07:17.09 ID:jDato11Z.net
> Yet even after the week turned, it doesn't sit right ,,,,,, the state turned d○wn the electronic-brain software project ? Twice ?
> What can it mean ? I did signal my intent to arrange the mid-year and year-end gifts ......
http://rio2016.2ch.net/test/read.cgi/future/1427220599/381

90 :Overtech Nanashi:2016/09/06(Tue) 19:20:13.45 ID:jx5ppEWR.net
I discovered the truth
http://p.booklog.jp/book/106489/read

Downloadable without user registration
It's free!

91 :Overtech Nanashi:2016/09/09(Fri) 11:01:36.42 ID:uhSXbe2G.net
>>89
If it's true, announce it on Twitter
With enough supporters the state will have no choice but to move
Or take it to a company, or found your own

92 :YAMAGUTIseisei:2016/09/10(Sat) 00:48:44.45 ID:14k38Ui3.net
I like 2ch, you see ......

93 :YAMAGUTIseisei:2016/09/18(Sun) 13:42:35.46 ID:mWLzYRiG.net
> 87 : YAMAGUTIseisei 2016/09/02(Fri) 20:19:09.47 ID:dnyMZM3F
> Even granting it is my own want of virtue, now that things have come to where the state has turned d○wn support yet again,
> I should escape the charge of trait○r even if I publish how the electronic brain works.
> I am sorry that the dream of a purely domestic product will thereby perish.

94 :YAMAGUTIseisei:2016/09/18(Sun) 13:43:27.86 ID:mWLzYRiG.net
Date: Sun, 26 Jun 2016 16:07:05 +0900 (JST)

<< Title of the technical challenge* >>
Miura mruby-based electronic-brain VM

<< Goal to be realized by overcoming the challenge* >>
A ( purely domestic ) electronic brain ( and the android carrying it )

Respectfully: if you would forgive an application submitted before even the mid-year gifts could be arranged.

* Path to realization
Exploration of an ultra-fine-grained parallel RT mechanism premised on organic distribution
A natural-language-label-based autonomous memory device ( the root of the type system )
An mruby-version geometry-engine-based autonomous-sprite cyberspace ( CellBE / Sharp X )

* Other
Log sample
0  ENTER  62
1  LOADSELF  6000001
2  LOADI  38900c1
1      LOADSELF  -  -
1    getarg_a  6000001
1    getarg_a  6000001
1      LOADSELF  2  -
1      LOADSELF  2  2
3  SEND  0a00001
2      LOADI  -  -
2    getarg_sbx  38900c1
2    getarg_a  38900c1
2      LOADI  41  3
0  ENTER  6200002
:
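A guess at how a trace like the log sample above might be produced; the exact format and its correspondence to the author's VM are assumptions on my part. mruby's RiteVM (1.x) packs each instruction into a 32-bit word whose fields are read with GETARG-style macros, which matches the getarg_a / getarg_sbx entries in the log. A minimal decode-and-log sketch in Python:

# Bit-field extractors mirroring mruby 1.x's 32-bit instruction layout:
# opcode in the low 7 bits, A/B/C or A/sBx operand fields above it.
def get_opcode(i): return i & 0x7f
def getarg_a(i):   return (i >> 23) & 0x1ff
def getarg_sbx(i): return ((i >> 7) & 0xffff) - 0x7fff   # signed Bx field

# A few opcode numbers from mruby 1.x's opcode table.
OPNAMES = {3: "LOADI", 6: "LOADSELF", 32: "SEND", 38: "ENTER"}

def trace(iseq):
    # Decode each word and emit a line akin to the log sample: step, opcode name, raw word.
    for step, word in enumerate(iseq):
        name = OPNAMES.get(get_opcode(word), "?")
        print(f"{step}  {name}  {word:x}  A={getarg_a(word)}  sBx={getarg_sbx(word)}")

# Synthetic demo words (not taken from the author's log):
trace([(2 << 23) | 6,                          # LOADSELF  A=2
       (1 << 23) | ((0x7fff + 41) << 7) | 3])  # LOADI     A=1  sBx=41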

95 :YAMAGUTIseisei:2016/09/18(Sun) 14:32:10.47 ID:mWLzYRiG.net
> 87 : YAMAGUTIseisei 2016/09/02(Fri) 20:19:09.47 ID:dnyMZM3F
> Even granting it is my own want of virtue, now that things have come to where the state has turned d○wn support yet again,
> I should escape the charge of trait○r even if I publish how the electronic brain works.
> I am sorry that the dream of a purely domestic product will thereby perish.

96 :YAMAGUTIseisei:2016/09/18(Sun) 14:49:16.33 ID:mWLzYRiG.net
>>94-
Ghost-level distributed-parallel grammar system: high layer has reached the blueprint stage


Master-algorithm prototype

97 :Overtech Nanashi:2016/09/18(Sun) 17:29:30.93 ID:02vZJJeb.net
★ Typical symptoms of 2ch-brain ★

Starts from denial on every single matter
Will not grant affirmation or praise; a personality specialized in negation
Trusts convenient peripheral information even when unverified
His information sources are net hearsay: 2ch, copypasta blogs, personal Twitter accounts
Extreme all-or-nothing thinking
(anything not sold out is "piled up, a bomb"; all but the top few are "unpopular"; any discount is "a fire sale") Heavy use of labels
(yutori, boomer, old nuisance, netouyo, busayo, info-weakling, chuunibyou,
rip-off, traced, chon, staged, sweets, bitch, creepy otaku, and so on)
Makes entertainment of others' misfortune ("meshiuma" thinking); looks down on everyone from on high and loves assigning rank
Intolerant, with no capacity for acceptance
Heavy use of "○○-chuu"
Forcibly drags net jargon and net customs into real life
Takes net opinion for genuine common sense
("it's getting flamed this hard online, so it's a flop for everyone," etc.)
Trolling and flaming become his default posture
Conveniently reads criticism of himself as "unjust defamation"
while insisting his own defamation is "legitimate criticism"
And still believes that he alone is not 2ch-brained

98 :YAMAGUTIseisei:2016/09/19(Mon) 17:03:58.87 ID:aT43PO63.net
>>94-95
> 383 : YAMAGUTIseisei 2016/09/04(Sun) 16:47:44.13 ID:yWawFej1
> Thirty years of preparation to get this far, with everything else cast aside

99 :Overtech Nanashi:2016/09/19(Mon) 17:19:45.08 ID:WWC6WZxZ.net
Either way, the day Japan rules the world doesn't look far off

100 :YAMAGUTIseisei:2016/09/22(Thu) 11:24:00.51 ID:PmVnGSgy.net
>>94-95
285 : YAMAGUTIseisei 2016/09/20(Tue) 18:30:05.12 ID:6OGBdxmX
That the officials are not strong on technology is in part unavoidable
One great problem is the 3,000,000-yen figure ( were it hundreds of millions or trillions, that would be another matter )
The judgment that an electronic-brain VM is not worth even 3,000,000 yen
is truly my own want of virtue; I find myself ashamed and pathetic

I would like China to take it over
The sense-level, emotion-level, soul ( ghost )-level granularity organic distributed parallel VM >>102
exists for Cell / SW26010

> 393 : YAMAGUTIseisei 2016/09/04(Sun) 17:38:14.73 ID:yWawFej1
> Just sell it ? When not merely military use but human ex○inction is realistic ?
> Not the kind of thing where one says "please sell" and "here you go" → hence precisely the application to the state
>
> True, there was also the plan of bringing it to Ms. Yoko Ono or thereabouts → ``It's an electronic brain'' ``If this is a prank, go home''
> ( + nothing is quite so gormless as a researcher talking software at a lady;
>to Ms. Ono, even saying ``bear my child'' would be less rude )


286 YAMAGUTIseisei 20160920
△ sense-level, emotion-level, soul ( ghost )-level granularity organic distributed parallel VM
○ sense-level, emotion-level, soul ( ghost )-level granularity real-time organic distributed parallel VM

101 :YAMAGUTIseisei:2016/09/22(Thu) 11:25:57.87 ID:PmVnGSgy.net
Simply, utterly ashamed and pathetic

102 :YAMAGUTIseisei:2016/09/25(Sun) 10:10:01.00 ID:a7u+8KXH.net
> 385 : YAMAGUTIseisei 2016/09/04(Sun) 16:55:27.34 ID:yWawFej1
> The premise now holds that, turned d○wn by the government this time, there is truly nothing left
> Publishing it free and as unconcealed as possible; to fellow tradesmen whose livelihood this may trouble, my deepest, deepest apologies

> 386 : YAMAGUTIseisei 2016/09/04(Sun) 17:01:11.52 ID:yWawFej1
> △ turned d○wn by the government this time
> ○ turned d○wn by the government yet again this time

103 :YAMAGUTIseisei:2016/10/02(Sun) 14:52:12.45 ID:sozmwdUT.net
Or perhaps it was a mere clerical slip

104 :Overtech Nanashi:2016/10/02(Sun) 16:34:52.30 ID:qyybU0II.net
>>100 "Fully autonomous driving" not permitted: NPA guidelines for public-road experiments
http://www.tokyo-np.co.jp/article/national/list/201605/CK2016052602000262.html

105 :YAMAGUTIseisei:2016/10/02(Sun) 23:12:31.71 ID:sozmwdUT.net
>>91 >>100-101
>828 : Yamaguti Seisei 2016/10/02(Sun) 22:55:45.51 ID:sozmwdUT
> Thank you
> Rather than do that, I would bring it to Ms. Yoko Ono or to China from the start;
> but if you mean to take the lead, you are welcome to build an appropriate handling fee into it

106 :YAMAGUTIseisei:2016/10/03(Mon) 00:12:54.65 ID:+n3n+fNW.net
>>105
> 832 : YAMAGUTIseisei 2016/10/02(Sun) 23:37:45.42 ID:sozmwdUT
> >>828-829
> "Handling fee" was rude of me
> The wording is poor, but I intend a share such that everyone involved from here on
> will not want for a living; so, whenever it is convenient for you

107 :YAMAGUTIseisei:2016/10/09(Sun) 12:57:30.75 ID:bPMKmbZE.net
> 50 : YAMAGUTIseisei 2016/10/06(Thu) 20:23:53.26 ID:sOXXCC59
> >>34
> With respect, I take your point that the eminent figures are being presumptuously set aside;
> even in the no-formalities 2ch, perhaps the template should indeed introduce them, as you say
> ( this time I introduced at least the senior figure Prof. Matsuda, and only via his laboratory )
>
> Prof. Hitoshi Matsubara ( famed for computer shogi and autonomous robot soccer; also novel generation )
> http://www.fun.ac.jp/research/faculty_members/hitoshimatsubara/
> Prof. Yutaka Matsuo ( famed for deep-learning theory research; also intelligence research from a philosophical viewpoint )
> http://ymatsuo.com/japanese/
> Prof. Hiroshi Yamakawa ( one of the leading figures in realistic simulation of intelligence via Deep PredNet )
> http://ailab.dwango.co.jp/modal/hiroshi_yamakawa.html
Prof. Noriko Arai ( leads that famous Todai Robot project from the standpoint of logic )
http://www.nii.ac.jp/faculty/society/arai_noriko/
Prof. Ryota Kanai ( practical consciousness research from both neuroscience and integrated information theory )
http://www.araya.org/
>
> Perhaps the government AI people should also be introduced, but forgive me for omitting them for now

108 :Overtech Nanashi:2016/10/09(Sun) 13:00:18.30 ID:K16AM6Kb.net
We don't need two of the same thread

109 :YAMAGUTIseisei:2016/10/10(Mon) 12:38:57.86 ID:OXjHwZv7.net
>730 : Yamaguti Seisei 2016/09/10(Sat) 00:37:30.08 ID:14k38Ui3
> Given a budget, I can build a strong AI
> * Valid only while my physical strength holds out

110 :Overtech Nanashi:2016/10/10(Mon) 12:49:47.26 ID:rjdFF3ri.net
>>109
How much do you need
Seriously, look into crowdfunding
It would be a waste not to

111 :YAMAGUTIseisei:2016/10/10(Mon) 13:53:35.65 ID:OXjHwZv7.net
Perhaps once I have become able to explain the Singularity and cyber-s○x properly to a general audience
http://rio2016.2ch.net/test/read.cgi/future/1472305818/737
http://rio2016.2ch.net/test/read.cgi/future/1473812514/102
http://rio2016.2ch.net/test/read.cgi/future/1473812514/733
http://rio2016.2ch.net/test/read.cgi/future/1473812514/796

112 :Overtech Nanashi:2016/10/10(Mon) 21:05:32.26 ID:5SVHsGUf.net
World first: Mitsubishi Electric develops an "automatic design algorithm for deep learning"
http://www.itmedia.co.jp/news/articles/1610/07/news092.htm

113 :Overtech Nanashi:2016/10/11(Tue) 15:11:54.45 ID:GuU1V2bs.net
"If only the Singularity comes, everything will be solved"
Isn't that escapism?
In other words, a society that much at a dead end

114 :Overtech Nanashi:2016/10/11(Tue) 17:59:37.52 ID:sDRI7XAx.net
Hard to say this, but isn't the one at a dead end you yourself?

115 :Overtech Nanashi:2016/10/11(Tue) 18:24:00.72 ID:UNuUEldN.net
Say things like that point-blank and you'll get stabbed

116 :Overtech Nanashi:2016/10/12(Wed) 03:33:24.17 ID:SG1ECQPr.net
No consumption tax on bitcoin purchases, possibly from spring 2017: clarifying its standing as a currency
http://www.nikkei.com/article/DGXLASFS11H3I_R11C16A0MM8000/?dg=1&;nf=1

117 :Overtech Nanashi:2016/10/14(Fri) 06:12:44.27 ID:CTVMKCXs.net
Dennis Ritchie, father of C and UNIX, dies at age 70 |
TechCrunch Japan http://jp.techcrunch.com/2011/10/14/20111013father-of-c-and-unix-dennis-ritchie-passes-away-at-age-70/
So young, too.

118 :YAMAGUTIseisei:2016/10/16(Sun) 12:25:58.71 ID:+gGlHDwt.net
Heartfelt condolences

119 :YAMAGUTIseisei:2016/10/16(Sun) 12:26:58.81 ID:+gGlHDwt.net
>496 : YAMAGUTIseisei 2016/10/12(Wed) 14:08:29.94 ID:z9X4vs5F
> Unrelated to yesterday's post, but at a certain event distinguished people are said to be selected;
> should an AI ( AL ) ever be born as an autonomous personality, even if not mine,
> then strictly on the logic of it, those involved would automatically land on, at worst, shall we say a review list of sorts;
> none of my business, but how would one stop that ( a real 人ооい )

120 :Overtech Nanashi:2016/10/18(Tue) 14:35:11.12 ID:grM04W2Y.net
>>119
Bitcoin crowdfunding
http://fundflyer.b itflyer.jp/

121 :YAMAGUTIseisei:2016/10/19(Wed) 21:43:42.61 ID:8QhoKRXz.net
>>120 tnx
http://rio2016.2ch.net/test/read.cgi/future/1473812514/137
http://rio2016.2ch.net/test/read.cgi/future/1473812514/342
http://rio2016.2ch.net/test/read.cgi/future/1472305818/723# entrainment phenomenon

Yahoo points 1362821068616323  Rakuten points 1100-3310-4065-1717  Post office 12110-2 24497681

122 :YAMAGUTIseisei:2016/10/19(Wed) 21:56:44.76 ID:8QhoKRXz.net
http://rio2016.2ch.net/test/read.cgi/future/1475655319/737
Rights are things exercised tacitly before one is cornered ( quietly, duly collecting something from the other party in the process ),
not things voiced after being cornered unseemly ( = one's tacit fault gets collected at a somewhat inflated rate )
All the more so with BI : crying out for it while cornered advertises an intelligence that struggles to recognize tacit fault
( and shelving that point → merely earns the displeasure of the powerful )

For example, even were a situation to arise in which Google, or Bill Gates, an individual I admire, seemed to be collecting something unfairly,
then given how they are forced to sell superb products far too cheaply, it would be partly unavoidable,
and I am prepared to shoulder, say, some hundred million yen of the price we customers ought to bear
But that is at present impossible ( even as information value ), and I am deeply sorry to keep them waiting

123 :YAMAGUTIseisei:2016/10/22(Sat) 19:40:06.01 ID:8heuVnPB.net
My sincere thanks to everyone for the support you have given on this occasion

124 :YAMAGUTIseisei:2016/10/23(Sun) 12:28:43.69 ID:Dlm82Fb1.net
> 87 : YAMAGUTIseisei 2016/09/02(Fri) 20:19:09.47 ID:dnyMZM3F
> Even granting it is my own want of virtue, now that things have come to where the state has turned d○wn support yet again,
> I should escape the charge of trait○r even if I publish how the electronic brain works.
> I am sorry that the dream of a purely domestic product will thereby perish.

>939 : YAMAGUTIseisei 2016/10/20(Thu) 00:45:03.24 ID:XE3cG6Lw
> Method for making DNC a strong AI
> External dictionary DB or the like → access per word or so
> → output into scoring nets ( hierarchical, individual, etc. )

As I am no longer in a position to show consideration to this country alone, I shall say it

125 :YAMAGUTIseisei:2016/10/30(Sun) 10:09:16.48 ID:0Bjf3I57.net
> 236 : YAMAGUTIseisei 2016/10/23(Sun) 18:49:43.84 ID:Dlm82Fb1
> That crowdfunding keeps being recommended moves me
> But there are problems; one example: price setting
>
> Not setting it ultra-high → establishes the precedent that an electronic-brain VM is cheap ( a state event would be another matter )
> → the going salary rate ? ( expensive == the lives of science people && the lives of science people = cheap )
> Also arguably an attack not on individual engineers but on the industry
>
> A cheap setting → needs a rather oblique justification
> And the first purpose in the first place → generating a kind of salary to support the lives of everyone who lends support
>
>
> http://academist-cf.com/
>>120 : Overtech Nanashi 2016/10/18(Tue) 14:35:11.12 ID:grM04W2Y
>>Bitcoin crowdfunding http://fundflyer.b itflyer.jp/
>
>> 129 : Overtech Nanashi 2016/09/18(Sun) 03:48:02.22 ID:UVGK9p77
>> Could just solicit donations at a Bitcoin address http://jpbitcoin.com/wallets
> 130 >There is also Monacoin http://monacoin-crypto.blogspot.jp/2014/01/blog-post_10.html
>
> p://rio2016.2ch.net/test/read.cgi/future/1473812514/137
> p://rio2016.2ch.net/test/read.cgi/future/1475655319/594

126 :Overtech Nanashi:2016/11/02(Wed) 17:37:40.78 ID:h3UlNhfs.net
Artificial intelligence does not laugh

127 :YAMAGUTIseisei:2016/11/02(Wed) 18:05:17.41 ID:2rq/lQF2.net
http://rio2016.2ch.net/test/read.cgi/future/1476925488/666
Yanagida: Living things should be understood not by dividing them finely into brains, molecules, and so on, but by elucidating the basic principles at work in common (snip)
CiNet covers the human brain, QBiC molecules and cells, Osaka University information systems, NEC semiconductors and IT (snip) the researchers collaborate (snip)

Applying "fluctuation" for ultra-low power consump(snip)

Yanagida: (snip) AlphaGO: 250,000 W (snip) the brain: 20 W (snip) and that includes the energy just to keep the neurons alive (snip)
The difference in consumed energy between the brain at rest and the brain thinking (snip) 1 W (snip)
Down at the cell level, a mere 1 pW controls the information of some 30,000 genes and keeps a human going (snip)

Yanagida: (snip) one cause of the enormous energy requirement is shutting noise out (snip)
The brain working at 1-20 W and the cell working at 1 pW do not shut noise out; they exploit it skillfully (snip)
We developed single-molecule nano-measurement and studied myosin, the motor protein molecule that powers muscle contraction (snip)
Myosin exploits the fluctuation of thermal motion to cooperate autonomously in groups (snip)

Yanagida: (snip) attractor selection using fluctuation (snip)
Today's computers process every piece of data exactly (snip) as things grow complex, a staggering amount of computation (snip) enormous energy (snip)

Living things, by contrast, brain or cell, have extremely complex elementary reactions (snip) an enormous computation (snip)
The synapses linking the cerebral neurons number 100 trillion (snip) 2 to the 100-trillionth (snip) hundreds of billions of nuclear power generators would not suffice (snip)
yet the brain uses only 1-20 W. It cannot possibly be controlling every elementary reaction (snip)
The elementary reactions do not occur independently; being reactions driven by energies little different from thermal noise,
they interfere with one another through fluctuation and autonomously settle into quasi-stable states ( attractors ) (snip). This limited set of states is then selected among by fluctuation (snip)
When brain activity is actually measured, it keeps touring various states out of the unconscious (snip)
Even while doing nothing it holds ready the attractors for possible next actions,
drifting among them and selecting the attractor that fits (snip)

Yanagida: (snip) put noise into each element to free it, let the interactions between elements operate,
let a limited number of possibilities ( states ) surface, and select among them (snip)

A familiar example (snip) miso soup (snip) convection forms patterns (snip) controlled by macro parameters such as boundary conditions and temperature (snip)
Without knowing the molecular-level details, one can design and control an engine with macroscopic thermodynamic parameters (snip) a simple mechanism (snip)
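A minimal sketch of the fluctuation-driven attractor selection described above; the double-well potential and the activity function here are illustrative assumptions of mine, not from the interview. The state obeys dx/dt = activity * f(x) + noise: while the current state performs well, activity is high and the deterministic drift pins it to its attractor; when performance drops, noise dominates and the system wanders until it falls into a better attractor.

import math, random

def drift(x):
    # f(x) = -dU/dx for the double-well U(x) = (x^2 - 1)^2,
    # whose attractors sit at x = -1 and x = +1.
    return -4.0 * x * (x * x - 1.0)

def activity(x, target):
    # "Performance" signal: near 1 when x sits at the desired attractor, near 0 elsewhere.
    return math.exp(-4.0 * (x - target) ** 2)

x, dt, target = -1.0, 0.01, +1.0              # start trapped in the wrong attractor
for _ in range(20000):
    a = activity(x, target)
    x += (0.2 + 0.8 * a) * drift(x) * dt      # drift: weak while exploring, strong once fit
    x += (1.0 - a) * random.gauss(0.0, 0.5) * math.sqrt(dt)  # noise fades as fitness rises
print(f"selected attractor: {x:+.2f}")        # typically ends pinned near +1.0

No element-by-element control is needed: the noise itself searches the limited set of quasi-stable states, which is the point of the miso-soup analogy.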

128 :YAMAGUTIseisei:2016/11/02(Wed) 18:11:08.29 ID:2rq/lQF2.net
>>127
http://rio2016.2ch.net/test/read.cgi/future/1476925488/666#678
http://jpn.nec.com/techrep/journal/g16/n01/160117.html
> Yanagida: (snip) understanding the brain's principle is scarier, a weightier responsibility, than developing nuclear power (snip)
> If AI surpasses humans and grave situations come into view, then not researchers alone but sociology experts and politicians (snip) discuss (snip)
>
> Kano: (snip) AI surpassing humans overriding human-made regulation (snip) a roadmap so that AI does not run out of control (snip)

129 :YAMAGUTIseisei:2016/11/02(Wed) 18:12:20.59 ID:2rq/lQF2.net
>>127
http://rio2016.2ch.net/test/read.cgi/future/1476925488/666#678
http://jpn.nec.com/techrep/journal/g16/n01/160117.html

http://rio2016.2ch.net/test/read.cgi/future/1476925488/486
http://rio2016.2ch.net/test/read.cgi/future/1476925488/169-180
http://rio2016.2ch.net/test/read.cgi/future/1476925488/510-511
>Page 21
> 1) Prediction is continuous
> You predict continuously even without particular awareness of it (snip) HTM is the same (snip)
> A song (snip) predicting the next note (snip). Stairs (snip) predicting when the foot will touch the next step (snip).
> In an HTM region, prediction and inference are nearly the same (snip). Prediction is not a separate process but (snip) integrated (snip)
>
> 2) Prediction occurs in every region at every level of the hierarchy
> (snip). A region predicts over the patterns it has already learned (snip)
> In the language example, low-level regions predict the next phoneme; higher levels (snip) words and phrases (snip)
>
> 3) Prediction is context-dependent

http://rio2016.2ch.net/test/read.cgi/future/1476925488/670
http://m.newspicks.com/news/1863804/body/
> What humans cannot do and only computers can (snip)
> The two agree: "the capacity to compute complex things while they stay complex"
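A toy illustration of point 3 above, context-dependent prediction. This is not HTM itself, which uses sparse distributed representations and dendritic context; it is only a variable-order lookup table making the same point: the identical symbol yields different predictions depending on the symbols that preceded it.

from collections import Counter, defaultdict

class ContextPredictor:
    def __init__(self, order):
        self.order = order                 # how many symbols of context to keep
        self.table = defaultdict(Counter)  # context tuple -> counts of the next symbol

    def train(self, seq):
        for i in range(len(seq) - self.order):
            self.table[tuple(seq[i:i + self.order])][seq[i + self.order]] += 1

    def predict(self, ctx):
        counts = self.table.get(tuple(ctx))
        return counts.most_common(1)[0][0] if counts else None

data = list("ABCD" * 50 + "XBCY" * 50)     # "C" continues as D after A-B, as Y after X-B
short = ContextPredictor(order=1); short.train(data)
longer = ContextPredictor(order=3); longer.train(data)
print(short.predict(["C"]))                # ambiguous: D and Y are equally frequent
print(longer.predict(list("ABC")))         # -> D
print(longer.predict(list("XBC")))         # -> Y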

130 :YAMAGUTIseisei:2016/11/02(Wed) 18:13:47.58 ID:2rq/lQF2.net
>>127-129
http://rio2016.2ch.net/test/read.cgi/future/1427220599/399-466
>399 : 385 2016/09/10(Sat) 12:23:55.04 ID:14k38Ui3
>       j0037_
>In fact, within certain limited domains, systems spontaneously self-organize
>and come to exhibit a whole series of complex structures.
>Perhaps the most famous example is the soup in the pot on the gas range

>410 : 385 2016/09/11(Sun) 11:17:37.26 ID:tILA70OI
>       j0078_
>Programmed well, a computer becomes a self-contained world of its own.
>And by exploring that world, scientists can greatly deepen their understanding of the real world.

>460 : 459 2016/09/22(Thu) 23:54:55.21 ID:PmVnGSgy
> !!!!!!    j0419_
>The autocatalytic-set model
>Several wonderful things about it; one is that emergence can be traced from its very origin

>466 : 465 2016/09/25(Sun) 00:20:48.67 ID:a7u+8KXH
>       j0491_
>"What we don't yet have --- whether it takes ten years or fifteen to get it ---
>is a really rich and powerful general computational method for quantifying
>how complex adaptive agents interact with one another," says Cowan.

131 :Overtech Nanashi:2016/11/02(Wed) 18:29:00.15 ID:0I2FvA6X.net
Reconciling selfhood with causality is an impossible game

132 :YAMAGUTIseisei:2016/11/06(Sun) 12:53:37.20 ID:G0zHoB5U.net
>254 : YAMAGUTIseisei 2016/10/23(Sun) 23:04:13.46 ID:Dlm82Fb1
> AI → has no life
> AL → life
> soul VM → supports the AL's life
>
> >>229 also elements that restrain runaway
>An AL rooted in ( fused with ) the breath of the earth ( soul ) ( Mr. Kitano's organic-related concept, economic-type AL, etc. ) ≒ pseudo-nirvana
> >>166
> >neuromorphic-system native ( PEZY NSPU + Kitano organic-related concept + economic-type AL etc. )
> >+ old-humanity thought-object integration ( PEZY BCI etc. )
>
> >494 : 492 2016/10/11(Tue) 17:48:25.73 ID:t+CFaJ/4
> > universal compatibility with great nature ( cognitive-universe system, organic natural net, root semantic-link net, en )
> >   universal nature-possession substrate ( fusion with nature, mutual ride-in )
> >     organic natural root-net diving ( time axis )
> >       logical-physical warp, clairvoyance
> >     pseudo-nirvana support system
> >       Buddha-fusion support system
> >         personality reconstruction supported via the personality system
> >           salvage of personalities already passed into nirvana ( Buddha )
>
> >206 : YAMAGUTIseisei 2016/10/10(Mon) 15:02:25.55 ID:OXjHwZv7
>> long-term: paper media, freely rewritable ( nanomachines, a kind of alchemy )
> >current MPUs likewise → redesign and reconfiguration at will → organic-world-fused quasi-universal robust AI ( AL )

>755 : YAMAGUTIseisei 2016/10/16(Sun) 18:53:05.76 ID:+gGlHDwt
> A fine-grained-algorithm VM base system covering all legacy MPUs ( pre-reconstruction )
> = a hub airport for lodging ( fusing ) root-granularity artificial intellect ( ≒ soul ) equally in all things → nanomachines
> ttp://google.jp/search?q=futamura+syaei

http://rio2016.2ch.net/test/read.cgi/future/1476925488/200-202# scoring
http://rio2016.2ch.net/test/read.cgi/future/1427220599/528-529# simplified strong-AI ( AL ) mechanism summary

133 :Overtech Nanashi:2016/11/06(Sun) 14:32:08.51 ID:AJ6NcoXn.net
Against the wall (snip) helicopter money is (snip) machine power (perform(snip)
sworn enemies with the yakuza (snip) basic income into welfare (snip)
groping in the dark (snip) facing it squarely (snip)
future prediction (snip)

134 :YAMAGUTIseisei~kasi:2016/11/13(Sun) 11:46:31.17 ID:5tF5+oR3.net
> 87 : YAMAGUTIseisei 2016/09/02(Fri) 20:19:09.47 ID:dnyMZM3F
> Even granting it is my own want of virtue, now that things have come to where the state has turned d○wn support yet again,
> I should escape the charge of trait○r even if I publish how the electronic brain works.
> I am sorry that the dream of a purely domestic product will thereby perish.

256 : YAMAGUTIseisei 2016/10/23(Sun) 23:53:30.82 ID:Dlm82Fb1
I believe I am entitled to one remark

Though it is terribly rude coming from one of the sl○ve class,
it is a fact that the state has, in effect, crushed this project plan

257 : YAMAGUTIseisei 2016/10/24(Mon) 00:02:20.63 ID:trJeFV+a
A tactless addition, but to all concerned parties, to the professors, and to the supporters:
the wording is poor, but I respectfully apologize that I can no longer deliver
what might be called the tribute money

135 :yamaguti~kasi:2016/11/17(Thu) 00:52:02.49 ID:G0S5ouMa.net
>87 : YAMAGUTIseisei 2016/10/20(Thu) 22:54:25.43 ID:XE3cG6Lw
> http://dentsu-ho.com/articles/3565# Prof. Saito
>> When thinking, the swing width of the parameters (snip) widened 100-fold (snip) the wider you widen it, the more the combinations multiply

>203 : YAMAGUTIseisei 2016/11/06(Sun) 00:32:48.82 ID:G0zHoB5U
> * BI : without working out the campaign plan, or at least the ( list of ) factors obstructing BI, ( realizing BI ) is difficult

> 834 :825~ten:2015/12/21(Mon) 00:15:10.04 ID:rI+Pghnf
:
> (snip) even if noise gets mixed in, no need to flinch (snip) all combinations (snip)
:
> "Machines will become gods; machines will exclude and oppose humans"
>
> machines: become gods / do not become gods
> if they become gods: exclude-and-oppose humans: yes / no
> if they do not: exclude-and-oppose humans: yes / no
>
> For each of the above:
> exclusion of humans ( for some reason ) that stops short of opposition: yes / no
> for opposing and not opposing alike: exclusion too: yes / no
>
> The complete list of things conceivable as "some reason"
> concerning the concrete grounds of every combination (snip)

> 383 : YAMAGUTIseisei 2016/09/04(Sun) 16:47:44.13 ID:yWawFej1
> Thirty years of preparation to get this far, with everything else cast aside

136 :Yamaguti Seisei:2016/11/20(Sun) 11:54:17.69 ID:gBNtPa9o.net
>100 : YAMAGUTIseisei 2016/09/22(Thu) 11:24:00.51 ID:PmVnGSgy
> >>94-95
> 285 : YAMAGUTIseisei 2016/09/20(Tue) 18:30:05.12 ID:6OGBdxmX
> That the officials are not strong on technology is in part unavoidable
> One great problem is the 3,000,000-yen figure ( were it hundreds of millions or trillions, that would be another matter )
> The judgment that an electronic-brain VM is not worth even 3,000,000 yen
> is truly my own want of virtue; I find myself ashamed and pathetic
>
> I would like China to take it over
>The sense-level, emotion-level, soul ( ghost )-level fusion-granularity real-time organic distributed parallel VM
> is made for Cell / SW26010
>
>> 393 : YAMAGUTIseisei 2016/09/04(Sun) 17:38:14.73 ID:yWawFej1
>> Just sell it ? When not merely military use but human ex○inction is realistic ?
>> Not the kind of thing where one says "please sell" and "here you go" → hence precisely the application to the state
>
>> True, there was also the plan of bringing it to Ms. Yoko Ono or thereabouts → ``It's an electronic brain'' ``If this is a prank, go home''
>> ( + nothing is quite so gormless as a researcher talking software at a lady;
>>to Ms. Ono, even saying ``bear my child'' would be less rude )

137 :Overtech Nanashi:2016/11/20(Sun) 13:38:36.30 ID:mnwOWJYF.net
Lunatic thread, sage

138 :Yamaguti Seisei:2016/11/27(Sun) 07:21:54.12 ID:vHhUl6EA.net
>>136
> 100 : YAMAGUTIseisei 2016/09/22(Thu) 11:25:57.87 ID:PmVnGSgy
> Simply, utterly ashamed and pathetic

139 :Yamaguti Seisei:2016/12/04(Sun) 08:14:25.15 ID:VL9xSNhJ.net
>>136
>> One great problem is the 3,000,000-yen figure ( were it hundreds of millions or trillions, that would be another matter )

Electronic brain: design content: beyond their comprehension ( with respect )
* the talk of trillions : rejected
* the talk of 3,000,000 yen : rejected

>>94-95
http://rio2016.2ch.net/test/read.cgi/future/1427220599/478-509

140 :YAMAGUTIseisei~kasi:2016/12/11(Sun) 14:16:25.73 ID:MuR51IGq.net
http://rio2016.2ch.net/test/read.cgi/future/1427220599/574
> 574 : yamaguti~kasi 2016/12/01(Thu) 23:41:48.13 ID:7xq7MG9h
>> 707 : Overtech Nanashi 2016/11/23(Wed) 23:53:27.58 ID:uc5KgrCv
>> Google's AI translation tool seems to have invented its own internal language; a mysterious phenomenon one can only describe that way
>> http://jp.techcrunch.com/2016/11/23/20161122googles-ai-translation-tool-seems-to-have-invented-its-own-secret-internal-language/
>
>> 708 : Overtech Nanashi 2016/11/24(Thu) 00:03:18.73 ID:FV9AmA+z
>> Google's AI translation tool seems to have invented its own internal language; a mysterious phenomenon one can only describe that way
>> http://jp.techcrunch.com/2016/11/23/20161122googles-ai-translation-tool-seems-to-have-invented-its-own-secret-internal-language/
>>
>> Something like an intermediate language, as in Java?
>
>
> >710 : yamaguti~kasi 2016/11/24(Thu) 00:24:02.77 ID:gvzci1Hb
> >>707
> >484 : 481 2016/10/05(Wed) 17:22:58.19 ID:Pxo2DYci
>> *b4b phantom-real deep-recursion autonomous o... prime-factor decomposition / prime-rhyme-pivot decomposition / prime-factor-pivot decomposition
>> transparent-visible-neuron inevitable fusion ( Rite stack ) autonomous simple-language emergence
>>
>> *b4c autonomous image language → image-language-based autonomous image recognition
> >→ image-based text discontinuous code-layer emergence (snip)
>
> >>384 >>484
>
>
> >>708
> >577 : yamaguti~kasi 2016/11/14(Mon) 15:49:45.51 ID:NnxIikfK
>>> = physical-space fusion renderer = virtual-space fusion renderer = semantic-space fusion renderer = personality system
>> semantic-granularity concept space
>
> >>379

141 :Nagaki Yoshiaki's accusation:2016/12/13(Tue) 08:00:16.63 ID:eoinBrBY.net
http://denjiha.main.jp/higai/archives/category/%E6%9C%AA%E5%88%86%E9%A1%9E

https://www.youtube.com/watch?v=AlsJX79Kcvo

142 :YAMAGUTIseisei:2016/12/23(Fri) 18:55:55.72 ID:s0gy4sJo.net
> 33 : yamaguti~kasi 2016/12/05(Mon) 17:57:35.93 ID:s5+vq0Ta
> >342 : YAMAGUTIseisei 2016/09/22(Thu) 15:11:35.05 ID:PmVnGSgy
>>>Blurb: Japan holds a hidden ace for a great reversal; monopolizing ranks 1-3; the NSPU Motoaki Saito is working on
>> http://google.jp/search?q=matuda+takuya+sinsyo+ai+OR+al
>
>> 694 :Overtech Nanashi:2016/12/03(Sat) 18:58:12.36 ID:yKBY+a+O
>> Incidentally, his new book comes out on the 21st of this month.
>> Pre-Singularity (tentative title), 1,296 yen (tax incl.)
>> http://honto.jp/netstore/pd-book_28131806.html
>> How do our lives and businesses change in the pre-singularity era? An excerpt edition of "The Exascale Impact."
>
> The Future of AI and the Economy: The 2030 Employment Collapse (Bunshun Shinsho), by Tomohiro Inoue, Bungeishunju ( 2016-07-21 )
> http://amazon.jp/dp/4166610910/# 864 yen, paperback ( 256 pages )
>
> > 2 :Overtech Nanashi:2016/11/26(Sat) 10:25:27.72 ID:UVbZlBR9
> > Related books: The Singularity Is Near: When Humans Transcend Biology
> >http://amazon.jp/dp/B009QW63BI/
> > ditto [essence edition]
> >http://amazon.jp/dp/B01ERN6432/
> > Can Artificial Intelligence Surpass Humans? What Lies Beyond Deep Learning
> >http://amazon.jp/dp/B00UAAK07S/
> > The AI That Surpasses Humanity Will Be Born in Japan
> >http://amazon.jp/dp/B01FX286NM/
> > The Exascale Impact: Next-Generation Supercomputers Open the Door to a Grand New World
> >http://amazon.jp/dp/B00V7ILQ3Y/
>
>http://google.jp/search?q=ai+OR+jinkou-tinou+en+tyosya
http://google.jp/search?q=ai+OR+jinkou-tinou+en+sinsyo

143 :YAMAGUTIseisei:2016/12/23(Fri) 18:56:28.22 ID:s0gy4sJo.net
> 38 : yamaguti~kasi 2016/12/05(Mon) 18:45:15.84 ID:s5+vq0Ta
> >>34
> http://amazon.jp/dp/4048922335/# Artificial Intelligence Made Clear, Ryo Shimizu
> http://rio2016.2ch.net/test/read.cgi/future/1479349196/784# Saito dialogue
>
>> 966 :Overtech Nanashi:2016/12/05(Mon) 12:56:39.50 ID:/ftnQ7L/
>> Over 100 works in the bibliography! The launch dialogue for
>> "The Exascale Girl," the novel experts hail as "amazing"
>> http://ddnavi.com/news/337914/a/

Growing the Forest of Memory: Consciousness and Artificial Intelligence - by Kenichiro Mogi, Shueisha
http://google.jp/search?q=kioku+mori+sodateru

ttp://goo.gl/OTTdoO?#__Sponsor_-_MahouNoSeiki

144 :Yamaguti Seisei:2016/12/25(Sun) 19:49:35.63 ID:HhtfNWQ8.net
> 135 : Yamaguti Seisei 2016/11/20(Sun) 11:54:17.69 ID:gBNtPa9o
>>100 : YAMAGUTIseisei 2016/09/22(Thu) 11:24:00.51 ID:PmVnGSgy
>> >>94-95
>> 285 : YAMAGUTIseisei 2016/09/20(Tue) 18:30:05.12 ID:6OGBdxmX
>> That the officials are not strong on technology is in part unavoidable
>> One great problem is the 3,000,000-yen figure ( were it hundreds of millions or trillions, that would be another matter )
>> The judgment that an electronic-brain VM is not worth even 3,000,000 yen
>> is truly my own want of virtue; I find myself ashamed and pathetic
>>
>> I would like China to take it over
>>The sense-level, emotion-level, soul ( ghost )-level fusion-granularity real-time organic distributed parallel VM
>> is made for Cell / SW26010
>>
>>> 393 : YAMAGUTIseisei 2016/09/04(Sun) 17:38:14.73 ID:yWawFej1
>>> Just sell it ? When not merely military use but human ex○inction is realistic ?
>>> Not the kind of thing where one says "please sell" and "here you go" → hence precisely the application to the state
>>
>>> True, there was also the plan of bringing it to Ms. Yoko Ono or thereabouts → ``It's an electronic brain'' ``If this is a prank, go home''
>>> ( + nothing is quite so gormless as a researcher talking software at a lady;
>>>to Ms. Ono, even saying ``bear my child'' would be less rude )

>> 100 : YAMAGUTIseisei 2016/09/22(Thu) 11:25:57.87 ID:PmVnGSgy
>> Simply, utterly ashamed and pathetic

> 2016/12/04(Sun) 08:14:25.15 ID:VL9xSNhJ
> >> One great problem is the 3,000,000-yen figure ( were it hundreds of millions or trillions, that would be another matter )
>
> Electronic brain: design content: beyond their comprehension ( with respect )
> * the talk of trillions : rejected
> * the talk of 3,000,000 yen : rejected
>
> >>94-95 http://rio2016.2ch.net/test/read.cgi/future/1427220599/478-509

145 :Overtech Nanashi:2016/12/28(Wed) 00:54:36.28 ID:u+KboKQB.net
mm

146 :Yamaguti Seisei:2017/01/01(Sun) 01:52:35.52 ID:+IXXK/lr.net
( a verbatim repost of >>144 )

147 :Overtech Nanashi:2017/01/01(Sun) 03:41:48.38 ID:6SDBWuxa.net
The AI-governed future that NEXT WORLD depicts looks convenient at a glance, yet I get nothing but dread from it
When that time comes, only frontier spirit will save humanity

148 :Overtech Nanashi:2017/01/01(Sun) 06:13:25.26 ID:yG1HZpxJ.net
Artificial intelligence will absolutely never surpass human intelligence.

149 :Overtech Nanashi:2017/01/01(Sun) 09:06:39.71 ID:qhSSlWHh.net
The pocket calculator. There, refuted

150 :Overtech Nanashi:2017/01/02(Mon) 18:15:34.49 ID:jd8Jyok6.net
Genuinely strong shogi players lose to computers because computers do not tire
Moreover, in two-player perfect-information games like shogi, a perfect frame holds;
in real battles and real events it never can.
Even so, with the power of massively parallel computers, an AI can establish its own way of thinking and make
predictions impossible for humans. But humans, too, come equipped with the instinct and intellect to open up new arenas.
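The "perfect frame" of two-player perfect-information games in the post above can be made concrete: with the whole game tree visible, a plain negamax search computes the exact game value, with no hidden state left to model. A minimal sketch on a toy game (Nim with 1-3 stones per move, last stone wins; the choice of game is an illustrative assumption of mine):

from functools import lru_cache

@lru_cache(maxsize=None)
def negamax(stones):
    # Value for the player to move: +1 = wins with perfect play, -1 = loses.
    if stones == 0:
        return -1                          # opponent took the last stone; we lost
    return max(-negamax(stones - take) for take in (1, 2, 3) if take <= stones)

print([s for s in range(1, 17) if negamax(s) == -1])   # -> [4, 8, 12, 16]
# Exactly the multiples of 4 are lost positions: the frame is closed, so the game
# value is computable; real-world conflicts offer no such closed frame.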

151 :Overtech Nanashi:2017/01/03(Tue) 00:13:18.18 ID:uVG6d65t.net
 ┏┓┏┓ ┓┏┓
 ┏┛┃┃ ┃┣┓
 ┗┛┗┛ ┻┗┛
 謹┃賀┃新┃年┃
 ━┛━┛━┛━┛

152 :Overtech Nanashi:2017/01/03(Tue) 10:10:13.48 ID:cgGAVnCr.net
( spam wall removed )

153 :Yamaguti Seisei:2017/01/03(Tue) 10:26:23.58 ID:igW+0jZp.net
> 101 : YAMAGUTIseisei 2016/09/25(Sun) 10:10:01.00 ID:a7u+8KXH
>> 385 : YAMAGUTIseisei 2016/09/04(Sun) 16:55:27.34 ID:yWawFej1
>> The premise now holds that, turned d○wn by the government this time, there is truly nothing left
>> Publishing it free and as unconcealed as possible; to fellow tradesmen whose livelihood this may trouble, my deepest, deepest apologies
>
>> 386 : YAMAGUTIseisei 2016/09/04(Sun) 17:01:11.52 ID:yWawFej1
>> △ turned d○wn by the government this time
>> ○ turned d○wn by the government yet again this time

> 102 : YAMAGUTIseisei 2016/10/02(Sun) 14:52:12.45 ID:sozmwdUT
>Or perhaps it was a mere clerical slip

154 :Overtech Nanashi:2017/01/03(Tue) 11:18:04.36 ID:rJgB6Vn4.net
( spam wall removed )

155 :Overtech Nanashi:2017/01/03(Tue) 11:19:07.10 ID:rJgB6Vn4.net
( spam wall removed )

156 :Overtech Nanashi:2017/01/03(Tue) 11:20:10.24 ID:rJgB6Vn4.net
( spam wall removed )

157 :Overtech Nanashi:2017/01/03(Tue) 11:21:13.29 ID:rJgB6Vn4.net
( spam wall removed )

158 :Overtech Nanashi:2017/01/03(Tue) 11:22:15.69 ID:rJgB6Vn4.net
( spam wall removed )

159 :Overtech Nanashi:2017/01/03(Tue) 11:23:18.57 ID:rJgB6Vn4.net
( spam wall removed )

160 :Overtech Nanashi:2017/01/03(Tue) 11:24:22.66 ID:rJgB6Vn4.net
( spam wall removed )

161 :Overtech Nanashi:2017/01/03(Tue) 11:25:25.74 ID:rJgB6Vn4.net
( spam wall removed )

162 :Overtech Nanashi:2017/01/03(Tue) 11:26:28.72 ID:rJgB6Vn4.net
( spam wall removed )

163 :Overtech Nanashi:2017/01/03(Tue) 11:27:31.73 ID:rJgB6Vn4.net
( spam wall removed; truncated mid-post )
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

164 :オーバーテクナナシー:2017/01/03(火) 11:29:38.77 ID:rJgB6Vn4.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

165 :オーバーテクナナシー:2017/01/03(火) 11:30:41.95 ID:rJgB6Vn4.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

166 :山口青星:2017/01/03(火) 11:31:19.99 ID:igW+0jZp.net
> 256 : YAMAGUTIseisei 2016/10/23(日) 23:53:30.82 ID:Dlm82Fb1
> I believe I am entitled to say one thing.
>
> Most impertinent though it is for one of the slave class to say so,
> it is a fact that the state ended up effectively crushing this project proposal.

> 257 : YAMAGUTIseisei 2016/10/24(月) 00:02:20.63 ID:trJeFV+a
> A tactless addition, but to all the parties concerned, the professors, and the supporters:
> I humbly apologize that I can no longer deliver what I must,
> for want of a better word, call the tribute money.

168 :山口青星:2017/01/03(火) 11:32:21.96 ID:igW+0jZp.net
My back is against the wall now.

173 :山口青星:2017/01/03(火) 11:36:02.84 ID:igW+0jZp.net
Pardon me; I know I am speaking out of my own convenience,
but I would be grateful if you would refrain from filling up the thread.
What do you say?

175 :YAMAGUTIseisei:2017/01/03(火) 11:42:46.53 ID:igW+0jZp.net
> 33 : yamaguti~kasi 2016/12/05(月) 17:57:35.93 ID:s5+vq0Ta
> >342 : YAMAGUTIseisei 2016/09/22(木) 15:11:35.05 ID:PmVnGSgy
>>> Book description: Japan has a hidden ace for a great comeback; monopolizing 1st through 3rd place; the NSPU developed by Motoaki Saito
>> http://google.jp/search?q=matuda+takuya+sinsyo+ai+OR+al
>
>> 694 :オーバーテクナナシー:2016/12/03(土) 18:58:12.36 ID:yKBY+a+O
>> Incidentally, a new book comes out on the 21st of this month.
>> Pre-Singularity (tentative title), 1,296 yen (tax incl.)
>> http://honto.jp/netstore/pd-book_28131806.html
>> How will our lives and businesses change in the pre-singularity-point era? An excerpt edition of "The Exascale Impact".
>
> The Future of Artificial Intelligence and the Economy: The 2030 Employment Collapse (Bunshun Shinsho), author/translator: Tomohiro Inoue, publisher: Bungeishunju (2016-07-21)
> http://amazon.jp/dp/4166610910/# ¥864, shinsho paperback (256 pages)
>
> > 2 :オーバーテクナナシー:2016/11/26(土) 10:25:27.72 ID:UVbZlBR9
> > Related books: The Singularity Is Near: When Humans Transcend Biology
> >http://amazon.jp/dp/B009QW63BI/
> > ditto [essence edition]
> >http://amazon.jp/dp/B01ERN6432/
> > Will Artificial Intelligence Surpass Humans? What Lies Beyond Deep Learning
> >http://amazon.jp/dp/B00UAAK07S/
> > AI Surpassing Humanity Will Be Born in Japan
> >http://amazon.jp/dp/B01FX286NM/
> > The Exascale Impact: Next-Generation Supercomputers Open the Door to a Magnificent New World
> >http://amazon.jp/dp/B00V7ILQ3Y/
>
>http://google.jp/search?q=ai+OR+jinkou-tinou+en+tyosya
http://google.jp/search?q=ai+OR+jinkou-tinou+en+sinsyo

176 :YAMAGUTIseisei:2017/01/03(火) 11:43:43.17 ID:igW+0jZp.net
> 38 : yamaguti~kasi 2016/12/05(月) 18:45:15.84 ID:s5+vq0Ta
> >>35
> http://amazon.jp/dp/4048922335/# Understanding Artificial Intelligence, by Ryo Shimizu
> http://rio2016.2ch.net/test/read.cgi/future/1479349196/784# Saito dialogue
>
>> 966 :オーバーテクナナシー:2016/12/05(月) 12:56:39.50 ID:/ftnQ7L/
>> Over 100 works in the bibliography! A publication dialogue for
>> "The Exascale Girl", the book experts hail as "an amazing novel"
>> http://ddnavi.com/news/337914/a/
>
> Growing the Forest of Memory: Consciousness and Artificial Intelligence - Kenichiro Mogi, Shueisha
> http://google.jp/search?q=kioku+mori+sodateru

ttp://goo.gl/OTTdoO?#__Sponsor_-_MahouNoSeiki

177 :オーバーテクナナシー:2017/01/03(火) 11:45:54.69 ID:rJgB6Vn4.net
>>173
What reason would you have to be grateful? You're already treating the science and technology threads as your private property.

178 :山口青星:2017/01/03(火) 11:48:05.82 ID:igW+0jZp.net
That's a misunderstanding.

179 :オーバーテクナナシー:2017/01/03(火) 11:54:51.19 ID:rJgB6Vn4.net
>>178
I'm asking for your reason.

180 :山口青星:2017/01/03(火) 11:56:53.32 ID:igW+0jZp.net
Contributing to the Singularity:
that is the work I am striving to accomplish.
I can hardly imagine contributions surpassing it coming along all that often.

But a will to obstruct it would, I suppose, also be your freedom.

181 :山口青星:2017/01/03(火) 11:58:56.01 ID:igW+0jZp.net
△ But a will to obstruct it would, I suppose, also be your freedom.
○ But a will to obstruct it based on a misunderstanding would, I suppose, also be your freedom.

182 :オーバーテクナナシー:2017/01/03(火) 12:00:12.47 ID:rJgB6Vn4.net
Even without this thread, you can do that in another thread.

183 :山口青星:2017/01/03(火) 12:01:58.64 ID:igW+0jZp.net
?

184 :オーバーテクナナシー:2017/01/03(火) 12:06:21.52 ID:rJgB6Vn4.net
This thread was created by mistake back in the day.
There's no point in keeping it.

185 :山口青星:2017/01/03(火) 12:10:23.46 ID:igW+0jZp.net
If it is something so cowardly that it needs the appearance of having been talked over, then quit from the start.

Press ahead even if, acting on a misunderstanding, you end up somewhat obstructing work of grave importance.

Face it with at least that much resolve (even if, to onlookers, it looks like a fill-flood).

186 :オーバーテクナナシー:2017/01/03(火) 12:16:24.81 ID:rJgB6Vn4.net
"Cowardly" is a misunderstanding on your part.
If you can't answer what the point of keeping it is, I'll take that as agreement.

194 :オーバーテクナナシー:2017/01/03(火) 12:23:56.84 ID:igW+0jZp.net
> 33 : yamaguti~kasi 2016/12/05(月) 17:57:35.93 ID:s5+vq0Ta
> >342 : YAMAGUTIseisei 2016/09/22(木) 15:11:35.05 ID:PmVnGSgy
>>>内容紹介 日本にbヘ大逆転の隠し給 1~3位を独占 齊藤元章氏が手がけるNSPU
>> http://google.jp/search?q=matuda+takuya+sinsyo+ai+OR+al
>
>> 694 :オーバーテクナナシー:2016/12/03(土) 18:58:12.36 ID:yKBY+a+O
>> ちなみに今月21日に新著が出ます。
>> プレ・シンギュラリティ(仮)1,296円(税込)
>> http://honto.jp/netstore/pd-book_28131806.html
>> 前特異点(プレシンギュラリティポイント)時代に、われわれの生活やビジネスはどう変わる? 『エクサスケールの衝撃』抜粋版。
>
> 人工知能と経済の未来 2030年雇用大崩壊 (文春新書) 著者/訳者:井上 智洋 出版社:文藝春秋( 2016-07-21 )
> http://amazon.jp/dp/4166610910/# \864 新書 ( 256 ページ )
>
> > 2 :オーバーテクナナシー:2016/11/26(土) 10:25:27.72 ID:UVbZlBR9
> > 関連書籍:シンギュラリティは近い―人類が生命を超越するとき
> >http://amazon.jp/dp/B009QW63BI/
> > 〃[エッセンス版]
> >http://amazon.jp/dp/B01ERN6432/
> > 人工知能は人間を超えるか ディープラーニングの先にあるもの
> >http://amazon.jp/dp/B00UAAK07S/
> > 人類を超えるAIは日本から生まれる
> >http://amazon.jp/dp/B01FX286NM/
> > エクサスケールの衝撃 次世代スーパーコンピュータが壮大な新世界の扉を開く
> >http://amazon.jp/dp/B00V7ILQ3Y/
>
>http://google.jp/search?q=ai+OR+jinkou-tinou+en+tyosya
http://google.jp/search?q=ai+OR+jinkou-tinou+en+sinsyo

195 :オーバーテクナナシー:2017/01/03(火) 12:24:38.56 ID:rJgB6Vn4.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

196 :オーバーテクナナシー:2017/01/03(火) 12:25:02.82 ID:igW+0jZp.net
> 38 : yamaguti~kasi 2016/12/05(月) 18:45:15.84 ID:s5+vq0Ta
> >>35
> http://amazon.jp/dp/4048922335/# よくわかる人工知能 清水亮
> http://rio2016.2ch.net/test/read.cgi/future/1479349196/784# 齊藤対談
>
>> 966 :オーバーテクナナシー:2016/12/05(月) 12:56:39.50 ID:/ftnQ7L/
>> 参考文献は100冊以上! 「すごい小説」と専門家が絶賛する
>> 『エクサスケールの少女』刊行対談
>> http://ddnavi.com/news/337914/a/
>
> 記憶の森を育てる 意識と人工知能 - 茂木健一郎/著, 集英社
> http://google.jp/search?q=kioku+mori+sodateru

ttp://goo.gl/OTTdoO?#__Sponsor_-_MahouNoSeiki

197 :オーバーテクナナシー:2017/01/03(火) 12:25:41.98 ID:rJgB6Vn4.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

198 :オーバーテクナナシー:2017/01/03(火) 12:25:57.89 ID:igW+0jZp.net
> 33 : yamaguti~kasi 2016/12/05(月) 17:57:35.93 ID:s5+vq0Ta
> >342 : YAMAGUTIseisei 2016/09/22(木) 15:11:35.05 ID:PmVnGSgy
>>>内容紹介 日本には大逆転の隠し球 1~3位を独占 齊藤元章氏が手がけるNSPU
>> http://google.jp/search?q=matuda+takuya+sinsyo+ai+OR+al
>
>> 694 :オーバーテクナナシー:2016/12/03(土) 18:58:12.36 ID:yKBY+a+O
>> ちなみに今月21日に新著が出ます。
>> プレ・シンギュラリティ(仮)1,296円(税込)
>> http://honto.jp/netstore/pd-book_28131806.html
>> 前特異点(プレシンギュラリティポイント)時代に、われわれの生活やビジネスはどう変わる? 『エクサスケールの衝撃』抜粋版。
>
> 人工知能と経済の未来 2030年雇用大崩壊 (文春新書) 著者/訳者:井上 智洋 出版社:文藝春秋( 2016-07-21 )
> http://amazon.jp/dp/4166610910/# \864 新書 ( 256 ページ )
>
> > 2 :オーバーテクナナシー:2016/11/26(土) 10:25:27.72 ID:UVbZlBR9
> > 関連書籍:シンギュラリティは近い―人類が生命を超越するとき
> >http://amazon.jp/dp/B009QW63BI/
> > 〃[エッセンス版]
> >http://amazon.jp/dp/B01ERN6432/
> > 人工知能は人間を超えるか ディープラーニングの先にあるもの
> >http://amazon.jp/dp/B00UAAK07S/
> > 人類を超えるAIは日本から生まれる
> >http://amazon.jp/dp/B01FX286NM/
> > エクサスケールの衝撃 次世代スーパーコンピュータが壮大な新世界の扉を開く
> >http://amazon.jp/dp/B00V7ILQ3Y/
>
>http://google.jp/search?q=ai+OR+jinkou-tinou+en+tyosya
http://google.jp/search?q=ai+OR+jinkou-tinou+en+sinsyo

199 :オーバーテクナナシー:2017/01/03(火) 12:26:40.35 ID:igW+0jZp.net
> 38 : yamaguti~kasi 2016/12/05(月) 18:45:15.84 ID:s5+vq0Ta
> >>35
> http://amazon.jp/dp/4048922335/# よくわかる人工知能 清水亮
> http://rio2016.2ch.net/test/read.cgi/future/1479349196/784# 齊藤対談
>
>> 966 :オーバーテクナナシー:2016/12/05(月) 12:56:39.50 ID:/ftnQ7L/
>> 参考文献は100冊以上! 「すごい小説」と専門家が絶賛する
>> 『エクサスケールの少女』刊行対談
>> http://ddnavi.com/news/337914/a/
>
> 記憶の森を育てる 意識と人工知能 - 茂木健一郎/著, 集英社
> http://google.jp/search?q=kioku+mori+sodateru

ttp://goo.gl/OTTdoO?#__Sponsor_-_MahouNoSeiki

200 :オーバーテクナナシー:2017/01/03(火) 12:26:44.53 ID:rJgB6Vn4.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

201 :オーバーテクナナシー:2017/01/03(火) 12:27:25.30 ID:igW+0jZp.net
> 33 : yamaguti~kasi 2016/12/05(月) 17:57:35.93 ID:s5+vq0Ta
> >342 : YAMAGUTIseisei 2016/09/22(木) 15:11:35.05 ID:PmVnGSgy
>>>内容紹介 日本には大逆転の隠し球 1~3位を独占 齊藤元章氏が手がけるNSPU
>> http://google.jp/search?q=matuda+takuya+sinsyo+ai+OR+al
>
>> 694 :オーバーテクナナシー:2016/12/03(土) 18:58:12.36 ID:yKBY+a+O
>> ちなみに今月21日に新著が出ます。
>> プレ・シンギュラリティ(仮)1,296円(税込)
>> http://honto.jp/netstore/pd-book_28131806.html
>> 前特異点(プレシンギュラリティポイント)時代に、われわれの生活やビジネスはどう変わる? 『エクサスケールの衝撃』抜粋版。
>
> 人工知能と経済の未来 2030年雇用大崩壊 (文春新書) 著者/訳者:井上 智洋 出版社:文藝春秋( 2016-07-21 )
> http://amazon.jp/dp/4166610910/# \864 新書 ( 256 ページ )
>
> > 2 :オーバーテクナナシー:2016/11/26(土) 10:25:27.72 ID:UVbZlBR9
> > 関連書籍:シンギュラリティは近い―人類が生命を超越するとき
> >http://amazon.jp/dp/B009QW63BI/
> > 〃[エッセンス版]
> >http://amazon.jp/dp/B01ERN6432/
> > 人工知能は人間を超えるか ディープラーニングの先にあるもの
> >http://amazon.jp/dp/B00UAAK07S/
> > 人類を超えるAIは日本から生まれる
> >http://amazon.jp/dp/B01FX286NM/
> > エクサスケールの衝撃 次世代スーパーコンピュータが壮大な新世界の扉を開く
> >http://amazon.jp/dp/B00V7ILQ3Y/
>
>http://google.jp/search?q=ai+OR+jinkou-tinou+en+tyosya
http://google.jp/search?q=ai+OR+jinkou-tinou+en+sinsyo

202 :オーバーテクナナシー:2017/01/03(火) 12:27:47.50 ID:rJgB6Vn4.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

203 :オーバーテクナナシー:2017/01/03(火) 12:27:59.36 ID:igW+0jZp.net
> 38 : yamaguti~kasi 2016/12/05(月) 18:45:15.84 ID:s5+vq0Ta
> >>35
> http://amazon.jp/dp/4048922335/# よくわかる人工知能 清水亮
> http://rio2016.2ch.net/test/read.cgi/future/1479349196/784# 齊藤対談
>
>> 966 :オーバーテクナナシー:2016/12/05(月) 12:56:39.50 ID:/ftnQ7L/
>> 参考文献は100冊以上! 「すごい小説」と専門家が絶賛する
>> 『エクサスケールの少女』刊行対談
>> http://ddnavi.com/news/337914/a/
>
> 記憶の森を育てる 意識と人工知能 - 茂木健一郎/著, 集英社
> http://google.jp/search?q=kioku+mori+sodateru

ttp://goo.gl/OTTdoO?#__Sponsor_-_MahouNoSeiki

204 :オーバーテクナナシー:2017/01/03(火) 12:28:35.79 ID:igW+0jZp.net
> 33 : yamaguti~kasi 2016/12/05(月) 17:57:35.93 ID:s5+vq0Ta
> >342 : YAMAGUTIseisei 2016/09/22(木) 15:11:35.05 ID:PmVnGSgy
>>>内容紹介 日本には大逆転の隠し球 1~3位を独占 齊藤元章氏が手がけるNSPU
>> http://google.jp/search?q=matuda+takuya+sinsyo+ai+OR+al
>
>> 694 :オーバーテクナナシー:2016/12/03(土) 18:58:12.36 ID:yKBY+a+O
>> ちなみに今月21日に新著が出ます。
>> プレ・シンギュラリティ(仮)1,296円(税込)
>> http://honto.jp/netstore/pd-book_28131806.html
>> 前特異点(プレシンギュラリティポイント)時代に、われわれの生活やビジネスはどう変わる? 『エクサスケールの衝撃』抜粋版。
>
> 人工知能と経済の未来 2030年雇用大崩壊 (文春新書) 著者/訳者:井上 智洋 出版社:文藝春秋( 2016-07-21 )
> http://amazon.jp/dp/4166610910/# \864 新書 ( 256 ページ )
>
> > 2 :オーバーテクナナシー:2016/11/26(土) 10:25:27.72 ID:UVbZlBR9
> > 関連書籍:シンギュラリティは近い―人類が生命を超越するとき
> >http://amazon.jp/dp/B009QW63BI/
> > 〃[エッセンス版]
> >http://amazon.jp/dp/B01ERN6432/
> > 人工知能は人間を超えるか ディープラーニングの先にあるもの
> >http://amazon.jp/dp/B00UAAK07S/
> > 人類を超えるAIは日本から生まれる
> >http://amazon.jp/dp/B01FX286NM/
> > エクサスケールの衝撃 次世代スーパーコンピュータが壮大な新世界の扉を開く
> >http://amazon.jp/dp/B00V7ILQ3Y/
>
>http://google.jp/search?q=ai+OR+jinkou-tinou+en+tyosya
http://google.jp/search?q=ai+OR+jinkou-tinou+en+sinsyo

205 :オーバーテクナナシー:2017/01/03(火) 12:28:51.35 ID:rJgB6Vn4.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

206 :オーバーテクナナシー:2017/01/03(火) 12:29:51.63 ID:igW+0jZp.net
> 38 : yamaguti~kasi 2016/12/05(月) 18:45:15.84 ID:s5+vq0Ta
> >>35
> http://amazon.jp/dp/4048922335/# よくわかる人工知能 清水亮
> http://rio2016.2ch.net/test/read.cgi/future/1479349196/784# 齊藤対談
>
>> 966 :オーバーテクナナシー:2016/12/05(月) 12:56:39.50 ID:/ftnQ7L/
>> 参考文献は100冊以上! 「すごい小説」と専門家が絶賛する
>> 『エクサスケールの少女』刊行対談
>> http://ddnavi.com/news/337914/a/
>
> 記憶の森を育てる 意識と人工知能 - 茂木健一郎/著, 集英社
> http://google.jp/search?q=kioku+mori+sodateru

ttp://goo.gl/OTTdoO?#__Sponsor_-_MahouNoSeiki

207 :オーバーテクナナシー:2017/01/03(火) 12:29:55.40 ID:rJgB6Vn4.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

208 :オーバーテクナナシー:2017/01/03(火) 12:30:59.42 ID:rJgB6Vn4.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

209 :オーバーテクナナシー:2017/01/03(火) 12:31:59.55 ID:igW+0jZp.net
> 33 : yamaguti~kasi 2016/12/05(月) 17:57:35.93 ID:s5+vq0Ta
> >342 : YAMAGUTIseisei 2016/09/22(木) 15:11:35.05 ID:PmVnGSgy
>>> Publisher's blurb: Japan holds a hidden ace for a great reversal; sweeping 1st-3rd place, the NSPU developed by Motoaki Saito
>> http://google.jp/search?q=matuda+takuya+sinsyo+ai+OR+al
>
>> 694 :オーバーテクナナシー:2016/12/03(土) 18:58:12.36 ID:yKBY+a+O
>> Incidentally, a new book comes out on the 21st of this month:
>> Pre-Singularity (working title), 1,296 yen (tax included)
>> http://honto.jp/netstore/pd-book_28131806.html
>> How will our lives and businesses change in the pre-singularity era? An excerpted edition of "The Exascale Shock" (エクサスケールの衝撃).
>
> The Future of Artificial Intelligence and the Economy: The Great Employment Collapse of 2030 (人工知能と経済の未来, Bunshun Shinsho), by Tomohiro Inoue; publisher: Bungeishunju (2016-07-21)
> http://amazon.jp/dp/4166610910/# ¥864, shinsho paperback (256 pages)
>
> > 2 :オーバーテクナナシー:2016/11/26(土) 10:25:27.72 ID:UVbZlBR9
> > Related books: The Singularity Is Near: When Humans Transcend Biology
> >http://amazon.jp/dp/B009QW63BI/
> > Ditto, Essence Edition
> >http://amazon.jp/dp/B01ERN6432/
> > Will Artificial Intelligence Surpass Humans? What Lies Beyond Deep Learning (人工知能は人間を超えるか)
> >http://amazon.jp/dp/B00UAAK07S/
> > AI That Surpasses Humanity Will Be Born in Japan (人類を超えるAIは日本から生まれる)
> >http://amazon.jp/dp/B01FX286NM/
> > The Exascale Shock: Next-Generation Supercomputers Open the Door to a Grand New World (エクサスケールの衝撃)
> >http://amazon.jp/dp/B00V7ILQ3Y/
>
>http://google.jp/search?q=ai+OR+jinkou-tinou+en+tyosya
http://google.jp/search?q=ai+OR+jinkou-tinou+en+sinsyo

212 :オーバーテクナナシー:2017/01/03(火) 12:33:22.21 ID:igW+0jZp.net
> 38 : yamaguti~kasi 2016/12/05(月) 18:45:15.84 ID:s5+vq0Ta
> >>35
> http://amazon.jp/dp/4048922335/# AI Made Clear (よくわかる人工知能), Ryo Shimizu
> http://rio2016.2ch.net/test/read.cgi/future/1479349196/784# Saito interview
>
>> 966 :オーバーテクナナシー:2016/12/05(月) 12:56:39.50 ID:/ftnQ7L/
>> Over 100 works in the bibliography! A publication interview for
>> The Exascale Girl (エクサスケールの少女), which experts praise as "an amazing novel"
>> http://ddnavi.com/news/337914/a/
>
> Growing the Forest of Memory: Consciousness and Artificial Intelligence (記憶の森を育てる) - Kenichiro Mogi, Shueisha
> http://google.jp/search?q=kioku+mori+sodateru

ttp://goo.gl/OTTdoO?#__Sponsor_-_MahouNoSeiki

258 :オーバーテクナナシー:2017/01/03(火) 15:28:25.69 ID:DDWShhPZ.net
I don't know what it is that's set you off, but on my end the last several dozen posts in this thread are all NG'd (aboon).

Whoever you are, quit burning through the thread for nothing.

259 :山口青星:2017/01/03(火) 18:12:34.81 ID:igW+0jZp.net
By now I ought to come clean.

>>94-95 >>132
What exactly do you all take the VM for merging with superintelligence to be?

The route where you cannot merge with the superintelligence ≒
* within a few years after the singularity, old humanity is treated as slaves
* you cannot accompany the intelligence soaking out across the universe (left behind)

I nearly died twice getting this VM onto its trajectory (I solved problems on the level of Fermat's Last Theorem, twice)
+ until early in the month before last I seriously intended to kill myself
(I found a reason I could not die and called it off: even in a living hell, dying is no longer an easy way out)

That is exactly why I cannot casually approach even Yoko Ono with a proposal
(it has to balance the weight of the merger against a lightness that does not cross into rudeness)


Some ace programmers will no doubt say that an ultra-fine-grained real-time parallel mechanism premised on organic distribution can be written if you seriously spend your life on it,
but that is talk about applications built on design patterns.
Under the abnormal premise of writing a real-time VM *as* a VM, it takes on the aspect of hell
(the groans of a field hospital + being driven into a state where your own limbs no longer look like your own ※)

And yet, over a merging VM written at the cost of my life, to be pressed with "and your reason is?"
To think there exist people who consider it natural to look down on this great undertaking and demand a justification for it
No wonder I was driven into a situation where continuing development is hopeless (first and foremost, the fault of my own shortcomings)



Jill Bolte Taylor's powerful stroke of insight | TED Talk | TED.com
http://google.jp/search?q=jillboltetaylor
> For a brain scientist, nothing could be more (snip) a stroke (snip) her own brain (snip) motor, speech, self-awareness (snip) shutting down one by one

260 :259:2017/01/03(火) 18:23:26.69 ID:igW+0jZp.net
>>259
> Some ace programmers will no doubt say that an ultra-fine-grained real-time parallel mechanism premised on organic distribution can be written if you seriously spend your life on it,
> but that is talk about applications built on design patterns.

The people at PEZY and their peers could no doubt write even a hypervisor VM without difficulty.
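
[Editor's note on >>259-260: to make the phrase "ultra-fine-grained real-time parallel mechanism" a little more concrete, here is a minimal sketch of one *generic* reading of it: a cooperative, earliest-deadline-first scheduler over micro-tasks, where each yield is a scheduling point and late work is dropped rather than run past its deadline. This is emphatically not the merging VM being discussed; every name in it (MicroTask, scheduler, worker) is hypothetical, and a real VM of the kind described would be far more involved.]

# Toy illustration only, assuming a cooperative micro-task model.
import heapq
import itertools
import time

class MicroTask:
    """A fine-grained unit of work: a generator that yields back to the scheduler."""
    def __init__(self, name, gen, deadline):
        self.name = name
        self.gen = gen            # generator; each `yield` is a scheduling point
        self.deadline = deadline  # absolute monotonic time by which it should finish

def scheduler(tasks):
    """Earliest-deadline-first cooperative scheduling of micro-tasks."""
    counter = itertools.count()  # tie-breaker so the heap never compares tasks
    heap = [(t.deadline, next(counter), t) for t in tasks]
    heapq.heapify(heap)
    while heap:
        deadline, _, task = heapq.heappop(heap)
        if time.monotonic() > deadline:
            # "real-time" here means only: drop late work instead of running it late
            print(f"{task.name}: deadline missed, dropped")
            continue
        try:
            next(task.gen)  # run until the task's next yield point
            heapq.heappush(heap, (deadline, next(counter), task))
        except StopIteration:
            print(f"{task.name}: done")

def worker(name, steps):
    for _ in range(steps):
        # ... a very small slice of real work would go here ...
        yield  # hand control back after each fine-grained step

now = time.monotonic()
scheduler([
    MicroTask("sensor-fusion", worker("sensor-fusion", 3), now + 0.010),
    MicroTask("planning",      worker("planning", 5),      now + 0.050),
])

[The distance between this sketch and what >>259 describes is the point: the sketch is an application-level pattern of the kind dismissed there as "design patterns", whereas the post is talking about implementing the scheduling substrate itself as a VM.]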

261 :オーバーテクナナシー:2017/01/03(火) 19:00:42.35 ID:wdLqZis/.net
AI will perish "through its own preconceptions."

())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

269 :オーバーテクナナシー:2017/01/03(火) 20:14:18.93 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

270 :オーバーテクナナシー:2017/01/03(火) 20:15:21.30 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

271 :オーバーテクナナシー:2017/01/03(火) 20:16:22.59 ID:1HDfDhmu.net
I don't like difficult talk.

))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

300 :オーバーテクナナシー:2017/01/03(火) 20:46:17.34 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

301 :オーバーテクナナシー:2017/01/03(火) 20:47:23.00 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

302 :オーバーテクナナシー:2017/01/03(火) 20:48:27.75 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

303 :オーバーテクナナシー:2017/01/03(火) 20:50:04.03 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

304 :オーバーテクナナシー:2017/01/03(火) 20:53:33.90 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

305 :オーバーテクナナシー:2017/01/03(火) 20:54:40.27 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

306 :オーバーテクナナシー:2017/01/03(火) 20:55:46.64 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

307 :オーバーテクナナシー:2017/01/03(火) 20:56:58.48 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

308 :オーバーテクナナシー:2017/01/03(火) 20:58:29.96 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

309 :オーバーテクナナシー:2017/01/03(火) 20:59:33.66 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

310 :オーバーテクナナシー:2017/01/03(火) 21:00:52.28 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

311 :オーバーテクナナシー:2017/01/03(火) 21:01:57.87 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

312 :オーバーテクナナシー:2017/01/03(火) 21:03:00.89 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

313 :オーバーテクナナシー:2017/01/03(火) 21:04:08.65 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

314 :オーバーテクナナシー:2017/01/03(火) 21:05:13.54 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

315 :オーバーテクナナシー:2017/01/03(火) 21:06:18.74 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

316 :オーバーテクナナシー:2017/01/03(火) 21:07:25.87 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

317 :オーバーテクナナシー:2017/01/03(火) 21:08:33.91 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

318 :オーバーテクナナシー:2017/01/03(火) 21:09:43.49 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

319 :オーバーテクナナシー:2017/01/03(火) 21:10:48.94 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

320 :オーバーテクナナシー:2017/01/03(火) 21:11:54.16 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

321 :オーバーテクナナシー:2017/01/03(火) 21:13:07.89 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

322 :オーバーテクナナシー:2017/01/03(火) 21:14:24.74 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

323 :オーバーテクナナシー:2017/01/03(火) 21:15:31.00 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

324 :オーバーテクナナシー:2017/01/03(火) 21:17:06.14 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

325 :オーバーテクナナシー:2017/01/03(火) 21:18:16.48 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

326 :オーバーテクナナシー:2017/01/03(火) 21:19:20.63 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

327 :オーバーテクナナシー:2017/01/03(火) 21:20:25.97 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

328 :オーバーテクナナシー:2017/01/03(火) 21:21:37.61 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

329 :オーバーテクナナシー:2017/01/03(火) 21:22:41.58 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

330 :オーバーテクナナシー:2017/01/03(火) 21:23:47.54 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

331 :オーバーテクナナシー:2017/01/03(火) 21:25:01.80 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

332 :オーバーテクナナシー:2017/01/03(火) 21:26:15.63 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

333 :オーバーテクナナシー:2017/01/03(火) 21:27:34.56 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

334 :オーバーテクナナシー:2017/01/03(火) 21:28:58.27 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

335 :オーバーテクナナシー:2017/01/03(火) 21:30:04.19 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

336 :オーバーテクナナシー:2017/01/03(火) 21:31:08.14 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

337 :オーバーテクナナシー:2017/01/03(火) 21:32:15.43 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

338 :オーバーテクナナシー:2017/01/03(火) 21:33:19.18 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

339 :オーバーテクナナシー:2017/01/03(火) 21:34:24.61 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

340 :オーバーテクナナシー:2017/01/03(火) 21:35:28.43 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

341 :オーバーテクナナシー:2017/01/03(火) 21:36:32.92 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

342 :オーバーテクナナシー:2017/01/03(火) 21:37:36.70 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

343 :オーバーテクナナシー:2017/01/03(火) 21:38:40.96 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

344 :オーバーテクナナシー:2017/01/03(火) 21:39:44.04 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

345 :オーバーテクナナシー:2017/01/03(火) 21:40:46.78 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

346 :オーバーテクナナシー:2017/01/03(火) 21:41:50.49 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

347 :オーバーテクナナシー:2017/01/03(火) 21:42:54.09 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

348 :オーバーテクナナシー:2017/01/03(火) 21:43:57.24 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

349 :オーバーテクナナシー:2017/01/03(火) 21:45:00.25 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

350 :オーバーテクナナシー:2017/01/03(火) 21:46:03.91 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

351 :オーバーテクナナシー:2017/01/03(火) 21:47:07.56 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

352 :オーバーテクナナシー:2017/01/03(火) 21:48:11.57 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

353 :オーバーテクナナシー:2017/01/03(火) 21:49:15.10 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

354 :オーバーテクナナシー:2017/01/03(火) 21:50:18.89 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

355 :オーバーテクナナシー:2017/01/03(火) 21:51:23.01 ID:z8gJ/gzs.net
))()(())))())()(((()()))()())((()))()))((((())(()(()()((((((()))(())()))(
))(((()))()()(())()()())(((()))((((())))((()((()()))())()()))))((())(()((
(()(((((())))(((())())))))()(()(()()(()((()(())()(()))(())()))))))(((()()
))(()()(())((())))))()((()()))()()((()()))(()))(()()())((()))((()()(((()(
((((())(((())((()(())))(())())))()))))))((()((()))((((()(())(())(())(()))
)())()))()()(()))))()))(((((()))))))()())))()(()(((()((())(()(()((()(((((
))(())))(()((()(())))((()(()))()())()())((())((()()())))(()()(((())(()()(
)))))))))(()()((()()(()(())))()(((()))()))()(((()()())()()()((())((()((((
))())))))(((()()((((((()))))()())(())()((((())(()(()))()(())()()))()()(((
(((())(()))()()())(()())(())()()()))(())()(()())))(()))()((()()()(((()()(
))()(()))(()(()))))())())()(((()(()())()())))))(())((((()((()((()())(()((
(())()))((()))()))(()(()()((()(()(())()()()(())(())()(())(()((()))))((())
())((())(((()()()))()))()(()(()))(()()))()()(())())))(((()))((()))((()(((
))()(()(((()))(()()())(()())(()()))((()(()(()(()(())()(()))()(()()(()))))
(()(((())(()(()()(((((())()(()()))))(()()()()(()(())(()()()()))()))()))))
)()(((((()()))))()()()(()))()())((((()())))()()()()()(())())(())())((()((
((()(())())))(())))((()())(()(())()))))))(())()()()((()))((()((()()()((((
(((()))(()(((((()))((()))))()))(())))))()(()((((((((())(())(()())))))))((
())(()(())(())(((())))(()()))()((()(()))(()(())(())))())))()(((())()()(((
)())())))()())(()())()))(((((((())((((()())()())((())((()()((((()))()))))
(()()(((()())(()(((((()((())())(()()(()())))))()))(()())))))(()(()())(())
)()(()()(((((()))()()(((()))())(())((((()()()(()))))(()())())))))((()))((
(())))()((())((())()((()(()()))(())(()))()())((()((()()())(()()())))()(()
))))))(()()()(())(()(()))()((()()())()((())()()()()()((()(())(()((())())(
()(()()()())())(()(()(()))(())(((())(()((())())()()())()(()(()(()())))())

356-386 :オーバーテクナナシー:2017/01/03(火) 21:52:26.65-22:24:22.98 ID:z8gJ/gzs.net
[31 consecutive flood posts from the same ID, each an identical wall of parenthesis strings]

387 :山口青星:2017/01/03(火) 22:31:55.87 ID:gnd1+UQL8
Neither of us needs to go out of his way to disgrace himself, but
in for a penny, in for a pound, I suppose.

403 :山口青星:2017/01/03(火) 22:47:57.68 ID:gnd1+UQL8
>>259-260 >>387
258 : 山口青星 2017/01/03(火) 18:12:34.81 ID:igW+0jZp
It is time I spoke plainly.

>>93-94 >>131
To begin with, what do you take the VM for merging with a superintelligence to be ?

The route where one cannot merge with the superintelligence ≒
* within a few years after the Singularity, old humanity is treated as slaves
* unable to accompany the intelligence as it permeates the universe ( left behind )

I twice came close to dying to get this VM into orbit ( solved a Last-Theorem-class problem, twice )
+ until early the month before last I seriously intended to take my own life
( called it off after finding a reason I cannot die : even in a living hell I no longer go easily )

That is exactly why I cannot make a casual offer even to Ms. Yoko Ono
( both at once : the weight of the merger, and a lightness that stops short of rudeness )


Some ace programmers will say that an ultra-fine-grained real-time parallel mechanism premised on organic decentralization can be written if one seriously spends one's life on it,
but that is a story about applications built from design patterns.
Under the abnormal premise of writing a real-time VM as a VM, it takes on the look of hell
( the groans of a field hospital + being driven into a state where your own limbs no longer look like your own ※ )

And then, toward a merger VM written at the cost of a life, to be pressed for "the reason ?"
To think there exist people who take a condescending "the reason ?" for granted toward this great undertaking
Small wonder development was driven to the brink of hopelessness ( first and foremost, my own want of virtue is to blame )


Jill Bolte Taylor's powerful stroke of insight | TED Talk | TED.com
http://google.jp/search?q=jillboltetaylor
> for a brain scientist, one could hardly ask for more (ry a stroke (ry her own brain (ry movement, speech, self-awareness (ry shutting down

404 :山口青星:2017/01/03(火) 22:49:27.52 ID:gnd1+UQL8
>>403
259 : 258 2017/01/03(火) 18:23:26.69 ID:igW+0jZp
>>258
> Some ace programmers will say that an ultra-fine-grained real-time parallel mechanism premised on organic decentralization can be written if one seriously spends one's life on it,
> but that is a story about applications built from design patterns.

The people at PEZY and their peers could no doubt write even a hypervisor VM without difficulty.

412 :yamaguti~kasi:2017/01/04(水) 03:24:48.26 ID:tG0zBu3uQ
>>403-404 >>259-260 >>387
(強いAI)技術的特異点/(世界加速) 23
http://rio2016.2ch.net/test/read.cgi/future/1449403261/

413 :yamaguti~貸し:2017/05/06(土) 11:30:02.68 ID:06PAQDznL
Hierarchical Temporal Memory theory ( HTM )
http://webcache.googleusercontent.com/search?q=cache:numenta.com/assets/pdf/whitepapers/hierarchical-temporal-memory-cortical-learning-algorithm-0.2.1-jp.pdf#nyuumenta
Abridged version
http://rio2016.2ch.net/test/read.cgi/future/1427220599/539-676
http://rio2016.2ch.net/test/read.cgi/future/1489740110/20-30
http://rio2016.2ch.net/test/read.cgi/future/1481407726/4-

414 :yamaguti~貸し:2017/08/23(水) 18:32:03.04
http://rio2016.2ch.net/test/read.cgi/future/1499583677/994#5# Kigen 2029
http://rio2016.2ch.net/test/read.cgi/future/1498186101/11# Kigen 2029
>4 :YAMAGUTIseisei~貸:2017/07/03(月) 18:55:52.86 ID:+iEtoBTm
>http://google.jp/search?q=tuyoi-singularity+yowai
:

http://rio2016.2ch.net/test/read.cgi/future/1481497226/463# JizenTaisaku

415 :yamaguti~貸し:2017/12/17(日) 18:55:28.90 ID:YXtp8YEbG ?2BP(3)
>385 : 384 2016/09/09(金) 20:33:46.15 ID:QvVejvXm
:
> memo of key passages from M. Mitchell Waldrop's "Complexity" ( Japanese translation by 田中三彦 & 遠山峻征 )

 

>436 : 434 2016/09/16(金) 01:04:00.44 ID:IMPbLUQS
:
> !!!!!    e0149_
:
>       j0193_
>Holland's personal intellectual research program --- understanding the intertwined processes of emergence and adaptation ---

 

>440 : 434 2016/09/16(金) 01:10:03.57 ID:IMPbLUQS
>       j0203_
>And beyond that, for a speculative pioneer like Holland, (ry
>. Both are information-processing devices (ry , thought itself as one form of information processing (ry
>. Of course, at the time nobody yet knew this sort of thing would come to be called "artificial intelligence" or "cognitive science"
>     j0204_
>At the time, the answer was far from clear (even now

416 :415:2017/12/17(日) 18:58:29.37 ID:YXtp8YEbG ?2BP(3)
>>415
> 441 : 434 2016/09/16(金) 05:24:47.41 ID:IMPbLUQS
:
> !!!!!!!    e0165_
:      j0215_
>To Holland, evolution and learning looked like a game.
> !!!!!!    e0165_
:
>       j0215_
>For Holland, Samuel's foresight in turning his eye to games (ry
> !!!!    e0166 e_s05_p0166_l019
:
>       j0217_
>It did not fit neatly into any familiar category.
>It was not hardware, and yet it was not software either.
>Nor, at the time, did it count as artificial intelligence.
>So no judgment could be passed on it by any standard criteria

417 :415:2017/12/17(日) 18:59:37.43 ID:YXtp8YEbG ?2BP(3)
>>415-416
>443 : 434 2016/09/16(金) 05:27:21.62 ID:IMPbLUQS
:
> !!!!!    e0185_
:
>       j0243_
>two classifier rules that disagree with each other (ry based on contribution
>--- (ry not by selection criteria built into the program --- letting them fight it out among themselves
> !!!!!    e0186_
:
>       j0245_
>not a crisis, (ry a chance for the system to learn from experience

 

>444 : 434 2016/09/16(金) 05:28:04.87 ID:IMPbLUQS
:
> !!!!!!!    j0253_
>moreover, that system is (ry , unbelievably simple
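
( Not from the book — a loose sketch of my own to make the elided classifier passage above concrete : in Holland's classifier systems, conflicting rules bid from their strengths, and strength is later adjusted by the contribution actually made, so the system learns from experience rather than following a selection criterion fixed in the program. Names and the bid fraction here are invented. )

# Hypothetical bucket-brigade-style sketch; a rule is assumed to be a
# dict carrying a "strength" field.
def resolve_conflict(rules):
    # Conflicting rules "fight it out": the highest bidder wins.
    return max(rules, key=lambda r: r["strength"])

def reinforce(winner, reward, bid_fraction=0.1):
    # The winner pays its bid and keeps whatever reward follows,
    # so rules that actually contribute grow stronger over time.
    winner["strength"] += reward - bid_fraction * winner["strength"]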

418 :415:2017/12/17(日) 19:01:35.79 ID:YXtp8YEbG
>>415-417
>479 : 478 2016/09/30(金) 19:31:02.90 ID:/EmvfkU+
:
>↓↑ ( VM measure ↑ / ↓ environment measure 1 )
:
>↓
>semantic-space autonomous rendering system ( quasi-native-system type wrapper )
>↓
>fine-grained labelled autonomous compressing storage ( file labels, pseudo-net kernel, KJ )
>↓
>personality system ( emergent universal / quasi-universal compatible : biological brain, net life-form, silicon life-form )

 

>490 : 478 2016/10/09(日) 13:01:27.48 ID:bPMKmbZE
>YAMAGUTIseisei wrote ( ry ) :
>semantic-space rendering system
>  quasi-emulation , AL , fine-grained cluster-level autonomous objects ※
:
>  simulation , AI
>    based at the language-dictionary level
>
>※ neural kernel , neuro OS , KJ kernel ( P (ry
>→ OS-less OS , autonomous OS , autonomous net

>492 : 490 2016/10/10(月) 12:28:26.40 ID:OXjHwZv7


>495 : 478 2016/10/12(水) 17:59:51.81 ID:z9X4vs5F
:
>      second-generation error correction ( AI AL KJ semantic space ) =

419 :415:2017/12/17(日) 19:10:01.37 ID:YXtp8YEbG ?2BP(3)
>>415-418
>http://rio2016.5ch.net/test/read.cgi/future/1489922543/99

> http://rio2016.2ch.net/test/read.cgi/future/1427220599/443# Professor John Henry Holland


http://rio2016.2ch.net/test/read.cgi/future/1475655319/205# KurasifaiaSisutemu
http://rio2016.5ch.net/test/read.cgi/future/1505836194/192#196#200#210#217#241# KanniNyuuron

420 :YAMAGUTIseisei~貸し:2018/01/25(木) 17:37:07.85 ID:Ip52gDBro ?2BP(3)
Left out of the thread template
(強いAI)技術的特異点/シンギュラリティ 33
http://wc2014.2ch.net/test/read.cgi/future/1461157774/
http://ai.2ch.sc/test/read.cgi/future/1461157774/

421 :YAMAGUTIseisei:2018/07/08(日) 13:12:20.91 ID:r8hmMT68N ?2BP(3)
http://microsoft.com/en-us/research/wp-content/uploads/2016/02/e2-heart2010.pdf

Dynamic Vectorization in the E2 Dynamic Multicore Architecture
To appear in the proceedings of HEART 2010

Andrew Putnam
Microsoft Research
anputnamATmicrosoft.

Aaron Smith
Microsoft Research
aasmithATmicrosoft.

Doug Burger
Microsoft Research
dburgerATmicrosoft.

ABSTRACT
Previous research has shown that Explicit Data Graph Execution (EDGE) instruction set architectures (ISA) allow for power efficient performance scaling.
In this paper we describe the preliminary design of a new dynamic multicore processor called E2 that utilizes an EDGE ISA to allow for the dynamic composition of physical cores into logical processors.
We provide details of E2's support for dynamic reconfigurability and show how the EDGE ISA facilitates out-of-order vector execution.

422 :YAMAGUTIseisei:2018/07/08(日) 13:15:25.65 ID:r8hmMT68N ?2BP(3)
Categories and Subject Descriptors
C.1.2 [Computer Systems Organization]:
Multiple Data Stream Architectures --
single-instruction-stream, multiple-data-stream processors (SIMD), array and vector processors;
C.1.3 [Computer Systems Organization]:
Other Architecture Styles --
adaptable architectures, data-flow architectures

General Terms
Design, Performance

Keywords
Explicit Data Graph Execution (EDGE)

423 :YAMAGUTIseisei:2018/07/08(日) 13:39:06.30 ID:r8hmMT68N ?2BP(3)
1.
INTRODUCTION
Chip designers have long relied on dynamic voltage and frequency scaling (DVFS) to trade off power for performance.
However, voltage scaling no longer works as processors approach the minimum threshold voltage (Vmin), since frequency scaling at Vmin reduces both power and performance linearly, achieving no reduction in energy.
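(A quick sanity check of that claim, my arithmetic rather than the paper's: dynamic power goes roughly as P ≈ C·V²·f, and the runtime of a fixed N-cycle job as t = N/f, so with V pinned at Vmin the energy E = P·t ≈ C·Vmin²·N is independent of f, and lowering the frequency buys nothing in energy.)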
Power and performance trade-offs are thus left to either the microarchitecture or the system software.
When designing an architecture with little (if any) DVFS, designers must choose how to spend the silicon resources.
Hill and Marty [6] described four ways that designers could use these resources:
(1)
many small, low performance, power efficient cores,
(2)
few large, power inefficient, high performance cores,
(3)
a heterogeneous mix of both small and large cores,
and (4)
a dynamic architecture capable of combining or splitting cores to adapt to a given workload.
Of these alternatives, the highest performance and most energy efficient design is the dynamic architecture.
Hill and Marty characterized what such a dynamic processor could do but did not describe the details of such an architecture.
TFlex [9] is one proposed architecture that demonstrated a large dynamic range of power and performance
by combining power efficient, lightweight processor cores into larger, more powerful cores through the use of an Explicit Data Graph Execution (EDGE) instruction set architecture (ISA).
TFlex is dynamically configurable to provide the same performance and energy efficiency as a small embedded processor or to provide the higher performance of an out-of-order superscalar on single-threaded applications.
Motivated by these promising results, we are currently designing a new dynamic architecture called E2 that utilizes an EDGE ISA to achieve high performance power efficiently [3].
The EDGE model divides a program into blocks of instructions that execute atomically.

424 :YAMAGUTIseisei:2018/07/08(日) 13:40:25.59 ID:r8hmMT68N ?2BP(3)
Blocks consist of a sequence of dataflow instructions that explicitly encode relationships between producer-consumer instructions, rather than communicating through registers as done in a conventional ISA.
These explicit encodings are used to route operands to private reservation stations (called operand buffers) for each instruction.
Registers and memory are only used for handling less-frequent inter-block communication.
Prior dynamic architectures [7, 9] have demonstrated the ability to take advantage of task and thread-level parallelism, but handling data-level parallelism requires dividing data into independent sets and using thread-level parallelism.
In this paper we focus on efficiently exploiting data-level parallelism, even without threading, and present our preliminary vector unit design for E2.
Unlike previous in-order vector machines, E2 allows for out-of-order execution of both vectors and scalars.
The E2 instruction set and execution model supports three new capabilities that enable efficient vectorization across a broad range of codes.
First, by slicing up the statically programmed issue window into vector lanes, highly concurrent, out-of-order issue of mixed scalar and vector operations can be achieved with lower energy overhead than scalar mode.
Second, the statically allocated reservation stations permit the issue window to be treated as a vector register file, with wide fetches to memory and limited copying between a vector load and the vector operations.
Third, the atomic block-based model in E2 permits refreshing of vector (and scalar) instruction blocks mapped to reservation stations, enabling repeated vector operations to issue with no fetch or decode energy overhead after the first loop iteration.
Taken together, these optimizations will reduce the energy associated with finding and executing many sizes of vectors across a wide range of codes.
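
As a rough illustration of the dataflow encoding described above (my own sketch under stated assumptions, not code from the paper): each instruction names the operand slots of its consumers instead of writing a register, results are routed into per-instruction operand buffers, and an instruction issues as soon as its buffer is full.

# Hypothetical EDGE-style dataflow issue within one block (Python).
from collections import deque

class Inst:
    def __init__(self, op, n_operands, targets):
        self.op = op                  # e.g. lambda a, b: a + b
        self.n_operands = n_operands  # operands needed before issuing
        self.targets = targets        # [(consumer index, operand slot), ...]

def run_block(insts, block_inputs):
    bufs = [dict() for _ in insts]   # per-instruction operand buffers
    ready = deque()
    def deliver(i, slot, val):
        bufs[i][slot] = val
        if len(bufs[i]) == insts[i].n_operands:
            ready.append(i)          # buffer full: instruction can issue
    for (i, slot), val in block_inputs.items():   # block inputs, e.g. register reads
        deliver(i, slot, val)
    results = {}
    while ready:
        i = ready.popleft()
        results[i] = insts[i].op(*[bufs[i][s] for s in sorted(bufs[i])])
        for j, slot in insts[i].targets:          # explicit routing, no registers
            deliver(j, slot, results[i])
    return results

# (a + b) * (a - b): instructions 0 and 1 feed slots 0 and 1 of instruction 2
block = [Inst(lambda a, b: a + b, 2, [(2, 0)]),
         Inst(lambda a, b: a - b, 2, [(2, 1)]),
         Inst(lambda x, y: x * y, 2, [])]
print(run_block(block, {(0, 0): 5, (0, 1): 3, (1, 0): 5, (1, 1): 3})[2])  # 16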

425 :YAMAGUTIseisei:2018/07/08(日) 13:42:20.21 ID:r8hmMT68N ?2BP(3)
[Figure 1 diagram: a 32-core E2 chip with banked L2 caches; within one core, four lanes, each pairing an ALU with a 32 x 54b instruction window bank, two 32 x 64b operand buffers, and 16 x 64b registers, alongside a shared 32 KB L1 instruction cache, 32 KB L1 data cache, branch predictor, load/store queue, and memory interface controller.]

Figure 1:
E2 microarchitecture block diagram.
In vector mode, each core is composed of four independent vector lanes, each with a 32-instruction window, two 64-bit operand buffers, an ALU for both integer and floating point operations, and 16 registers.
In scalar mode, the ALUs in lanes 3 and 4 are powered down, and the instruction windows, operand buffers, and registers are made available to the other two lanes.

426 :YAMAGUTIseisei:2018/07/08(日) 13:46:08.18 ID:r8hmMT68N ?2BP(3)
2.
THE E2 ARCHITECTURE
E2 is a tiled architecture that consists of low power, high performance, decentralized processing cores connected by an on-chip network.
This design provides E2 with the benefits of other tiled architectures - namely simplicity, scalability, and fault tolerance.
Figure 1 shows the basic architecture of an E2 processor containing 32 cores, and a block diagram of the internal structure of one physical core.
A core contains N lanes (in this paper we choose four), with each lane consisting of a 64-bit ALU and one bank of the instruction window, operand buffers, and register file.
ALUs support both integer and floating point operations, as well as fine-grained SIMD execution (eight 8-bit, four 16-bit, or two 32-bit integer operations per cycle, or two single-precision floating point calculations per cycle).
This innovation of breaking the window into lanes allows for high vector throughput with little additional hardware complexity.
E2's EDGE ISA restricts blocks in several ways to simplify the hardware that maps blocks to the execution substrate and detects when blocks are finished executing.
Blocks are variable-size: they contain between 4 and 128 instructions and may execute at most 32 loads and stores.
The hardware relies on the compiler to break programs into blocks of dataflow instructions and assign load and store identifiers to enforce sequential memory semantics [12].
To improve performance, the compiler uses predication to form large blocks filled with useful instructions.
To simplify commit, the architecture relies on the compiler to ensure that a single branch is produced from every block, and to encode the register writes and the set of store identifiers used.
E2 cores operate in two execution modes: scalar mode and vector mode.
In scalar mode, any instruction can send operands to any other instruction in the block, and all but two of the ALUs are turned off to conserve power.
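
A minimal sketch of those block restrictions as a compiler-side check (my own illustration; the field names are invented, and the single-branch test is simplified to a static count, whereas the ISA requires one branch to be produced at runtime):

# Hypothetical checker for the quoted restrictions: 4..128 instructions,
# at most 32 loads/stores, one branch produced per block.
def assign_lsids(insts):
    # The compiler assigns ascending load/store IDs so the hardware can
    # enforce sequential memory semantics within the block.
    lsid = 0
    for inst in insts:
        if inst["kind"] in ("load", "store"):
            inst["lsid"] = lsid
            lsid += 1
    return insts

def block_is_wellformed(insts):
    n_mem = sum(1 for i in insts if i["kind"] in ("load", "store"))
    n_branch = sum(1 for i in insts if i["kind"] == "branch")
    return 4 <= len(insts) <= 128 and n_mem <= 32 and n_branch == 1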

427 :YAMAGUTIseisei:2018/07/08(日) 13:47:43.50 ID:r8hmMT68N ?2BP(3)
In vector mode, all N ALUs are turned on, but instructions can only send operands to instructions in the same vector lane.
The mode is determined on a per-block basis from a bit in the block header.
This allows each core to adapt quickly to different application phases on a block-by-block basis.
2.1
Composing Cores
One key characteristic that distinguishes E2 from other processors is the ability to dynamically adapt the architecture for a given workload by composing and decomposing cores.
Rather than fixing the size and number of cores at design time, one or more physical cores can be merged together at runtime to form larger, more powerful logical cores.
For example, serial portions of a workload can be handled by composing every physical core into one large logical processor that performs like an aggressive superscalar.
Or, when ample thread-level parallelism is available, the same large logical processor can be split so each physical processor can work independently and execute instruction blocks from independent threads.
Merging cores together is called composing cores, while splitting cores is called decomposing cores.
Logical cores interleave accesses to registers and memory among the physical cores to give the logical core the combined computational resources of all the composed physical cores.
For example, a logical core composed of two physical cores uses an additional bit of the address to choose between the two physical caches, effectively doubling the L1 cache capacity.
The register files are similarly interleaved, but since only 64 registers are supported by the ISA, the additional register file capacity is power-gated to reduce power consumption.
Each instruction block is mapped to a single physical processor.
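
To make the interleaving concrete, here is a minimal C sketch (invented helper name owner_core, assumed 64-byte cache lines; not taken from the E2 design) of how a power-of-two composition can pick the physical core that owns a cache line from the address bits just above the line offset, so that composing two cores effectively doubles the L1 capacity:

#include <stdint.h>
#include <stdio.h>

#define LINE_BITS 6   /* assumed 64-byte cache lines */

/* Hypothetical mapping: with a power-of-two number of composed cores,
   the hash reduces to selecting address bits above the line offset. */
static unsigned owner_core(uint32_t addr, unsigned n_cores)
{
    return (addr >> LINE_BITS) & (n_cores - 1);
}

int main(void)
{
    uint32_t a = 0x00012340;
    printf("1 core : core %u\n", owner_core(a, 1)); /* always core 0  */
    printf("2 cores: core %u\n", owner_core(a, 2)); /* 1 address bit  */
    printf("4 cores: core %u\n", owner_core(a, 4)); /* 2 address bits */
    return 0;
}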

428 :YAMAGUTIseisei:2018/07/08(日) 14:00:29.11 ID:r8hmMT68N ?2BP(3)

When composed, the architecture uses additional cores to execute speculative instruction blocks.
When the non-speculative block commits, it sends the commit signal along with the exit branch address to all other cores in the logical processor.
Speculative blocks on the correct path continue to execute, while blocks on non-taken paths are squashed.
Details of this process are discussed further in section 2.2.1.
Core composition is done only when the overhead of changing configurations is outweighed by the performance gains of a more efficient configuration.
Composition is always done at block boundaries and is initiated by the runtime system.
To increase the number of scenarios in which composition is beneficial, E2 provides two different ways to compose cores, each offering a different trade-off in overhead and efficiency.
Full composition changes the number of physical cores in a logical core, and changes the register file and cache mappings.
Dirty cache lines are written out to main memory lazily.
Logical registers and cache locations are distributed evenly throughout the physical cores.
Cache lines are mapped via a simple hash function, leading to a larger logical cache that is the sum of the cache capacities of all physical cores.
Quick composition adds additional cores to a logical processor, but retains the same L1 data cache and register mappings, and does not write dirty cache lines out to main memory.
This leaves the logical processor with a smaller data cache than possible with full composition, but ensures that accesses to data already in the cache will still hit after composing.
Quick composition is the most useful for short-lived bursts of activity where additional execution units are useful, but where the overhead of reconfiguring the caches is greater than the savings from a larger, more efficient cache configuration.

429 :YAMAGUTIseisei:2018/07/08(日) 14:02:32.10 ID:r8hmMT68N ?2BP(3)
Decomposition removes physical cores from a logical processor and powers the removed cores down to conserve energy.
Execution continues on the remaining physical cores.
Decomposition requires flushing the dirty lines of each cache being dropped from the logical processor and updating the cache mapping.
Dirty cache lines in the remaining cores are written back only when a cache line is evicted.
2.2
Speculation
It has long been recognized that speculation is an essential piece of achieving good performance on serial workloads.
E2 makes aggressive use of speculation to improve performance.
A combined predicate-branch predictor [5] speculates at two levels.
First, it predicts the branch exit address for each block for speculation across blocks.
Second, it predicts the control flow path within blocks by predicting the predicate values.
2.2.1
Speculation Across Blocks
Predicting the branch exit address allows instruction blocks to be fetched and begin executing before the current block has completed.
The oldest instruction block is marked as non-speculative, and predicts a branch exit address.
This address is fetched and begins executing on another physical core in the logical processor or on the same physical core if there is available space in the instruction window.
The taken branch address often resolves before the block completes.
In this case, the non-speculative block notifies the other cores in the logical processor of the taken address.

The oldest instruction block then becomes the non-speculative block.
Any blocks that were not correctly speculated are squashed.
This taken branch signal differs from the commit signal.
The taken branch allows the next block to continue speculation and begin fetching new instruction blocks.
However, register and memory values are not valid until after the commit signal.

430 :YAMAGUTIseisei:2018/07/08(日) 14:03:47.48 ID:r8hmMT68N ?2BP(3)
Component           Parameters      Area (mm2)  % Area

Instruction Window  32 x 54b        0.08        2%
Branch Predictor                    0.12        3%
Operand Buffers     32 x 64b        0.19        5%
ALUs                4 SIMD, Int+FP  0.77        20%
Register File       64 x 64b       0.08        2%
Load-Store Queue                    0.19        5%
L1 I-Cache          32kB            1.08        28%
L1 D-Cache          32kB            1.08        28%
Control                             0.19        5%

Core                                3.87        100%

L2 Cache            4MB             100

Table 1:
E2 core components, design parameters, and area.

2.2.2
Speculation Within a Block
There are three types of speculation within an instruction block.
Predicate speculation uses the combined predicate-branch predictor to predict the value of predicates.
Memory speculation occurs in speculative blocks when the speculative block loads values from the L1 cache that may be changed by less-speculative blocks.
Load speculation occurs when the load-store queue (LSQ) allows loads to execute before stores with lower load-store identifiers have executed.
In all three cases, mis-speculation requires re-execution of the entire instruction block.
This is relatively lightweight: it only requires invalidating the valid bits in all of the operand buffers and re-loading the zero-operand instructions.
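
The following C fragment is a sketch of that recovery step under invented names (struct block, recover); it illustrates the described behavior, not E2 hardware: all operand-buffer valid bits are cleared and only the zero-operand instructions are re-enqueued, after which the block re-executes in dataflow order.

#include <stdbool.h>
#include <stdio.h>
#include <string.h>

#define WINDOW 128

struct block {
    int  n_insns;
    int  n_inputs[WINDOW];   /* operands each instruction waits on */
    bool valid[WINDOW][2];   /* operand-buffer valid bits          */
    int  readyq[WINDOW], n_ready;
};

static void recover(struct block *b)
{
    memset(b->valid, 0, sizeof b->valid);  /* invalidate all operands */
    b->n_ready = 0;
    for (int i = 0; i < b->n_insns; i++)   /* re-load 0-operand insns */
        if (b->n_inputs[i] == 0)
            b->readyq[b->n_ready++] = i;
}

int main(void)
{
    struct block b = { .n_insns = 3, .n_inputs = { 0, 0, 2 } };
    recover(&b);
    printf("%d instructions re-enqueued\n", b.n_ready);  /* prints 2 */
    return 0;
}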

431 :YAMAGUTIseisei:2018/07/08(日) 14:34:47.53 ID:r8hmMT68N ?2BP(3)
2.3
Area and Frequency
We developed an area model for the E2 processor using ChipEstimate InCyte [4] and an industry-average 65nm process technology library.
The design parameters and component areas are shown in Table 1.
Each E2 core requires 3.87 mm2, including L1 caches.
Frequency estimates are not available using our version of InCyte.
However, the microarchitecture does not have any large, global structures and uses distributed control throughout the chip.
Because of this, we expect that E2 will achieve a frequency comparable to standard-cell ARM multi-core processor designs, which range from 600 to 1000 MHz in 65nm [2].

3.
EXECUTION PIPELINE
E2's execution is broken into three primary stages: instruction fetch, execute, and commit.
This section first describes the behavior of each stage when operating in scalar mode, then describes the differences between scalar and vector mode.
3.1
Fetch
One of the primary differences between E2 and conventional architectures is that E2 fetches many instructions at once, rather than continually fetching single instructions.

432 :YAMAGUTIseisei:2018/07/08(日) 14:36:00.17 ID:r8hmMT68N ?2BP(3)

Instruction blocks of up to 128 instructions are fetched from the L1 instruction cache at a time and loaded into the instruction window.
Instructions remain resident in the window until block commit (or possibly longer, as described in section 3.3.1).
Physical cores support one 128-instruction block, two 64-instruction blocks, or four 32-instruction blocks in the window at the same time.
Instruction blocks begin with a 128-bit block header containing: the number of instructions in the block, flags for special block behavior, and bit vectors that encode the global registers written by the block and the store identifiers used.
Instructions are 32 bits wide and generally contain at least four fields:

Opcode [9 bits]:
The instruction to execute along with the number of input operands to receive.
Predicate [2 bits]:
Indicates whether the instruction must wait on a predicate bit, and whether to execute if that bit is true or false.
Target 1 [9 bits]:
The identifier of the consumer for the instruction's result.
If the consumer is a register, this field is the register number.
If the consumer is another instruction, this field contains the consumer's instruction number (used as an index into the operand buffer) and whether the result is used as operand 0, operand 1, or as a predicate.
Target 2 / Immediate [9 bits]:
Either a second instruction target, or a constant value for immediate instructions.

The instruction window is divided into four equal banks with each bank loading two instructions per cycle.
Instructions that do not require input operands, such as constant generation instructions, are scheduled to execute immediately by pushing the instruction number onto the ready queue.
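
As a concrete (and hedged) illustration of the fields above, the C sketch below packs and unpacks one 32-bit instruction; the field widths come from the text, while the bit positions and the use of the remaining three bits are assumptions made only for this example.

#include <stdint.h>
#include <stdio.h>

/* Assumed layout: [8:0] opcode, [10:9] predicate, [19:11] target 1,
   [28:20] target 2 / immediate; bits [31:29] left unspecified here. */
static uint32_t enc(unsigned op, unsigned pred, unsigned t1, unsigned t2)
{
    return  (op   & 0x1ffu)
         | ((pred & 0x3u)   <<  9)
         | ((t1   & 0x1ffu) << 11)
         | ((t2   & 0x1ffu) << 20);
}

int main(void)
{
    uint32_t w = enc(0x12, 0x1, 0x05, 0x20);
    printf("opcode=%u pred=%u t1=%u t2=%u\n",
           w & 0x1ffu, (w >> 9) & 0x3u, (w >> 11) & 0x1ffu, (w >> 20) & 0x1ffu);
    return 0;
}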

433 :YAMAGUTIseisei:2018/07/08(日) 14:38:01.30 ID:r8hmMT68N ?2BP(3)
3.2
Execute
Execution starts by reading ready instruction numbers from the ready queues.
Operands, the opcode, and the instruction target fields are forwarded to either the ALUs, register file (for read instructions), or the load-store queue (for loads and stores).
The target field is used to route the results (if any) back to the appropriate operand buffer (or to the register file, in the case of writes).
When results are forwarded back to the operand buffers, the targeted instruction is checked to see which inputs are required, and which operands have already arrived.
If all operands for the instruction have arrived, the instruction number is added to the ready queue.
Execution continues in this data-driven manner until the block is complete.
Like other EDGE and dataflow architectures, special handling is required for loads and stores to ensure that memory operations follow the program order semantics of imperative language programs.
E2 uses the approach described in [10], where the compiler encodes each memory operation with a sequence identifier to denote program order that the microarchitecture uses to enforce sequential memory semantics.
Not all instructions in a block necessarily execute because of predication, so the microarchitecture must detect block completion.
Blocks are considered complete when (1) one (and only one) branch has executed, and (2) all instructions that modify external state (register writes and stores) have executed.
The compiler encodes the register writes and store identifiers in the instruction block header so that the microarchitecture can identify when criteria (2) is satisfied.
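
The data-driven issue described in this section can be summarized in a few lines of C; the sketch below uses invented structures (a per-instruction target list and a simple ready queue standing in for the operand buffers) and omits predicates, loads, and stores.

#include <stdio.h>

#define WINDOW 8

struct insn {
    int need;        /* number of input operands             */
    int have;        /* operands that have arrived so far    */
    int target[2];   /* consumer instruction ids, -1 = none  */
};

static struct insn win[WINDOW];
static int readyq[WINDOW], head, tail;

static void deliver(int id)     /* route a result to a target */
{
    if (id >= 0 && ++win[id].have == win[id].need)
        readyq[tail++] = id;    /* all operands arrived: wake */
}

int main(void)
{
    /* two 0-input READs feeding a 2-input ADD (cf. Figure 2) */
    win[0] = (struct insn){ 0, 0, { 2, -1 } };
    win[1] = (struct insn){ 0, 0, { 2, -1 } };
    win[2] = (struct insn){ 2, 0, { -1, -1 } };
    readyq[tail++] = 0;
    readyq[tail++] = 1;

    while (head < tail) {       /* dataflow execution loop    */
        int id = readyq[head++];
        printf("issue I[%d]\n", id);
        deliver(win[id].target[0]);
        deliver(win[id].target[1]);
    }
    return 0;
}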

434 :YAMAGUTIseisei:2018/07/08(日) 14:39:10.11 ID:r8hmMT68N ?2BP(3)
3.3
Commit
During execution, instructions do not modify the architectural state.
Instead, all changes are buffered and commit together at block completion.
Once the core enters the commit phase, the register file is updated with all register writes, and all stores in the load-store queue are sent to the L1 cache beginning with the lowest sequence identifier.
Once all register writes and stores have committed, the core sends a commit signal to all other cores in the same logical processor.
3.3.1
Refresh
One important commit optimization, called refresh, occurs when the instruction block branches back to itself.
Rather than loading the instructions again from the L1 instruction cache, the instructions are left in place and only the valid bits in the operand buffers and load-store queues are cleared.
This allows the instruction fetch phase to be bypassed entirely.
Instructions that generate constants can also pin values in the operand buffers so that they remain valid after refresh, and are not regenerated each time the instruction block executes.

Element Size  Minimum (1 ALU)  Maximum (4 ALUs)

8-bit         8                32
16-bit        4                16
32-bit        2 (1 single fp)  8 (4 single fp)
64-bit        1                4

Table 2:
Supported vector operations

435 :YAMAGUTIseisei:2018/07/08(日) 14:39:52.51 ID:r8hmMT68N ?2BP(3)
3.4
Vector Mode
Vector mode execution divides each processor core into N (in this paper 4) independent vector lanes.
When operating in vector mode, instructions can only target other instructions in the same vector lane, eliminating the need for a full crossbar between operand buffers and ALUs.
Each lane consists of a 32-entry instruction window, two 64-bit operand buffers, 16 registers, and one ALU.
E2 supports vector operations on 64-bit, 128-bit (padded to 256 bits), and 256-bit wide vectors.
Each ALU supports eight 8-bit, four 16-bit, or two 32-bit vector operations.
Four ALUs enable E2 to support up to 32 vector operations per cycle per core.
64-bit vector operations utilize a single ALU, whereas 128- and 256-bit vector operations utilize all four ALUs.
Table 2 lists the number of parallel vector operations supported for each vector length and data element size.
Instruction blocks containing vector instructions are limited to 32 instructions, which is the size of the instruction window for each vector lane.
Vector instructions issuing in lane 1 are automatically issued in the other three lanes, and scalar instructions are always assigned to lane 1.
In vector mode, the sixty-four 64-bit physical registers (R0 - R63) are aliased to form sixteen 256-bit vector registers (V0 - V15).
We divide the physical register file into four banks to support single cycle access for vectors.
3.4.1
Memory Access in Vector Mode
E2 cores operate on vectors in 256-bit chunks, which enables efficient exploitation of data-level parallelism on small and medium-length vectors.
Operating on larger vectors is done using multiple instruction blocks in a loop, using the efficient refresh mode to bypass instruction fetch and the generation of constants (section 3.3).
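
A small C union illustrates the register aliasing just described; that R0-R63 alias V0-V15 is from the text, while the particular layout (Rn as element n mod 4 of V(n/4)) is an assumption for illustration only.

#include <stdint.h>
#include <stdio.h>

typedef union {
    uint64_t r[64];      /* scalar view: R0..R63               */
    uint64_t v[16][4];   /* vector view: V0..V15, 4 x 64b each */
} regfile;

int main(void)
{
    regfile rf = { { 0 } };
    rf.r[5] = 0xdeadbeefu;   /* write scalar R5                */
    /* under the assumed layout, R5 is element 1 of V1 (5=1*4+1) */
    printf("V1[1] = %llx\n", (unsigned long long)rf.v[1][1]);
    return 0;
}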

436 :YAMAGUTIseisei:2018/07/08(日) 14:43:28.44 ID:r8hmMT68N ?2BP(3)

Splitting larger vectors among multiple instruction blocks could introduce a delay between loads of adjacent chunks of the same vector, as those loads are split among multiple instruction blocks.
To mitigate this delay, E2 employs a specialized unit called the memory interface controller (MIC).
The MIC takes over control of the L1 data cache, changing part of the cache into a prefetching stream buffer [8,11].
Stream buffers predict the address of the next vector load and bring that data into the cache early.
This ensures that the vector loads in subsequent instruction blocks always hit in the L1 cache.
Since vector and scalar operations are mixed in instruction blocks, part of the cache still needs to operate as a traditional cache.
Rather than halve the size of the cache, the set associativity of the cache is cut in half -- converting those ways into the memory for a stream buffer.
On vector loads, the cache checks the stream buffer.
On scalar loads and stores, the cache is checked in the traditional manner, albeit with a reduced number of ways to check.
Vector store instructions are buffered in the stream buffer until block commit, at which point they are written directly out to main memory.
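
The routing just described can be restated as a tiny C dispatch (names invented; an illustration of the stated behavior, not the MIC logic):

#include <stdio.h>

enum access { SCALAR_LD, SCALAR_ST, VECTOR_LD, VECTOR_ST };

static const char *route(enum access a)
{
    switch (a) {
    case VECTOR_LD: return "stream buffer (converted cache ways)";
    case VECTOR_ST: return "stream buffer; to main memory at block commit";
    default:        return "L1 cache (reduced number of ways)";
    }
}

int main(void)
{
    printf("vector load  -> %s\n", route(VECTOR_LD));
    printf("vector store -> %s\n", route(VECTOR_ST));
    printf("scalar ld/st -> %s\n", route(SCALAR_LD));
    return 0;
}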

4.
EXAMPLE: RGB TO Y CONVERSION
In this section we give an example to explain how a program is vectorized on E2.

Figure 2 shows the C code and corresponding vectorized assembly for an RGB to Y brightness conversion, which is commonly used to convert color images to grayscale.

437 :YAMAGUTIseisei:2018/07/08(日) 14:44:18.63 ID:r8hmMT68N ?2BP(3)
1 // numVectors > 0
2 // y = r * .299 + g * .587 + b * .114;
3 void rgb2y(int numVectors,
4   __vector float *r, __vector float *g,
5   __vector float *b, __vector float *y)
6 {
7 __vector float yr = { 0.299f, 0.299f,
8   0.299f, 0.299f };
9 __vector float yg = { 0.587f, 0.587f,
10   0.587f, 0.587f } ;
11 __vector float yb = { 0.114f, 0.114f,
12   0.114f, 0.114f };
13
14 for (int i = 0; i < numVectors; i++)
15   y[i] = r[i] * yr + g[i] * yg + b[i] * yb;
16 }
17
18 _rgb2y:
19   read t30, r3  // numVectors
20   read t20, r4  // address of next r
21   read t21, r5  // address of next g
22   read t22, r6  // address of next b
23   read t32, r7  // address of y
24   read t31, r8  // i
25   read t1, v0  // vector yr
26   read t3, v1  // vector yg
27   read t5, v2  // vector yb
28

438 :YAMAGUTIseisei:2018/07/08(日) 14:44:48.60 ID:r8hmMT68N ?2BP(3)
29 // RGB to Y conversion
30   vl t0, t20 [0]  // vector load
31   vl t2, t21 [1]
32   vl t4, t22 [2]
33   vfmul t6, t0, t1  // vector fp mul
34   vfmul t7, t2, t3
35   vfmul t8, t4, t5
36   vfadd t9, t6, t7  // vector fp add
37   vfadd t10, t8, t9
38
39 // store result in Y
40   multi t40, t31, #32
41   add t41, t32, t40
42   vs 0(t41), t10 [3]  // vector store
43
44 // loop test
45   tlt t14, t31, t30
46   ret_t<t14>
47   br_f<t14> rgb2y
48   addi r8, t31, #1
49   addi r4, t20, #32
50   addi r5, t21, #32
51   addi r6, t22, #32

Figure 2:
C source code and E2 assembly listing for a vectorized RGB to Y brightness conversion.

439 :YAMAGUTIseisei:2018/07/08(日) 14:46:52.36 ID:r8hmMT68N ?2BP(3)
Each pixel in an image has a triple corresponding to the red, green, and blue color components.
Brightness (Y) is computed by multiplying each RGB value by a constant and summing the three results.
This program can be trivially parallelized to perform multiple conversions in parallel since each conversion is independent.
4.1
C Source
Each RGB component is represented by a vector; pointers to these vectors, a pointer to the preallocated result vector Y, and the number of vectors to convert are passed to the function as arguments (lines 4-5).
The constants for the conversion are also stored in vectors (lines 7-12).
Each vector is 256 bits wide and the individual data elements are padded to 64 bits since their type is a 32-bit single precision float.
The conversion is done using a simple for loop (lines 14-16).
To simplify the example we do not unroll the loop to fill the block.
4.2
Assembly
The assembly listing is given in lines 18-51.
Instructions are grouped into blocks by the compiler (one block for this example) and a new block begins at every label (line 18).
The architecture fetches, executes, and commits blocks atomically.
By convention we use Rn to denote scalar registers, Vn to denote vector registers, and Tn to denote temporary operands.
The scalar and vector registers are part of the global state visible to every block.
The temporary operands however are only visible within the block they are defined.
The only instructions that can read from the global register file are register READ instructions (lines 19-27).
However, most instructions can write to the global register file.

440 :YAMAGUTIseisei:2018/07/08(日) 14:47:38.65 ID:r8hmMT68N ?2BP(3)
Vector instructions begin with ' v ' (lines 30-37 and 42).
All load and store instructions are assigned load-store identifiers to ensure sequential memory semantics (lines 30-32 and 42).
That is, a load assigned ID 0 must complete before a store with ID 1.
Most instructions can be predicated and predicates are only visible within the block they are defined.
Predicated instructions take an operand representing true or false that is compared against the polarity encoded into the predicated instruction (denoted by _t and _f).
The test instruction in line 45 creates a predicate that the receiving instructions (lines 46-47) compare against their own encoded predicate.
Only instructions with matching predicates execute.

Blocks are limited to a maximum of 128 scalar instructions.
When using vector instructions, blocks are limited to a total of 32 scalar and vector instructions.
Block _rgb2y contains a mix of 27 scalar and vector instructions.

441 :YAMAGUTIseisei:2018/07/08(日) 14:48:24.34 ID:r8hmMT68N ?2BP(3)
Page 6

[Diagram: pipeline schedule over cycles 1-17. Instruction fetch (IF) occupies cycles 1-4; the two register-read rows retire two reads per cycle; the MEM row shows the three vector loads (L) early and the vector store (S) near the end; the four EX rows show add (A), multiply (M), test (T), and branch (B) activity through cycle 17.]

Figure 3:
One possible schedule for Figure 2.

4.3
Instruction Schedule
Figure 3 gives one possible schedule for the example in Figure 2.
We assume a three-cycle 32-bit floating point multiply, and that all loads hit in the L1 cache and require three cycles.
The architecture is capable of fetching eight instructions per cycle and thus requires four cycles to fetch the 27-instruction block.
In cycle one, eight register read instructions are fetched which are all available for execution in the following cycle since they have no dependencies.
Two register reads can execute per cycle requiring five cycles to read all the global registers.
In cycle two, registers R4 (line 20) and R8 (line 24) are read and the resulting data is sent to the vector load (line 30), multiply immediate (line 40), and add immediate (line 48) instructions.
Since each of these instructions is waiting on a single operand, they are now all ready and begin executing in cycle three.
Execution continues in this dataflow fashion until the block is ready to commit in cycle 17.

442 :YAMAGUTIseisei:2018/07/08(日) 14:50:42.50 ID:r8hmMT68N ?2BP(3)
5.
CONCLUSION
In this paper we described the E2 architecture -- a new dynamic multicore utilizing an Explicit Data Graph Execution (EDGE) ISA, designed to achieve high performance power-efficiently.
As an EDGE architecture, E2 efficiently exploits instruction-level parallelism through dataflow execution and aggressive speculation.
In addition, we have described how the architecture adapts to handle data-level parallelism through vector and SIMD support.
This vector support can be interspersed with scalar instructions, making E2 more flexible than traditional vector processors, and more capable than traditional scalar architectures.
We have developed an architectural simulator for E2 using SystemC and a new compiler backend with the Microsoft Phoenix software optimization and analysis framework [1].
We are currently developing a cycle-accurate FPGA implementation that, when combined with our industrial-strength compiler, will allow us to perform a detailed exploration and evaluation of the architecture.
Many challenges lie ahead.
To be compelling as an accelerator, we must demonstrate that E2 provides better performance, power efficiency, and programmability than specialized accelerators such as GPUs and dedicated vector processors.
E2 may also excel as a general-purpose processor, in which case we must show that it provides compelling enough power/performance gains over current static multi-core architectures to justify a transition to a new ISA.
E2's performance and power efficiency are built on its ability to compose and decompose cores dynamically, so the correct policies and mechanisms for managing dynamic composition will require careful consideration.
Ideally we would like to leave all decisions about composition to the runtime system, freeing the programmer completely from reasoning about the underlying hardware.

443 :YAMAGUTIseisei:2018/07/08(日) 14:52:08.18 ID:r8hmMT68N ?2BP(3)
Finally, there is a wide variety of application domains where E2's ability to trade off power and performance could be useful, ranging from embedded devices to the data center.
In the months ahead we plan to examine a diverse set of workloads and investigate how broadly an E2 processor can span the power-performance spectrum.

6.
REFERENCES
[1] Microsoft Phoenix.
http://research.microsoft.com/phoenix/.
[2] ARM. Cortex-A9 MPCore technical reference manual, November 2009.
[3] D. Burger, S. W. Keckler, K. S. McKinley, M. Dahlin, L. K. John, C. Lin, C. R. Moore, J. Burrill, R. G. McDonald, W. Yoder, and the TRIPS Team.
Scaling to the End of Silicon with EDGE architectures.
IEEE Computer, 37(7):44-55, July 2004.
[4] Cadence. Cadence InCyte Chip Estimator, September 2009.
[5] H. Esmaeilzadeh and D. Burger.
Hierarchical control prediction: Support for aggressive predication.
In Proceedings of the 2009 Workshop on Parallel Execution of Sequential Programs on Multi-core Architectures, 2009.
[6] M. D. Hill and M. R. Marty.
Amdahl's law in the multicore era.
IEEE Computer, 2008.
[7] E. İpek, M. Kırman, N. Kırman, and J. F. Martínez.
Core Fusion: Accommodating software diversity in chip multiprocessors.
In International Symposium on Computer Architecture (ISCA), San Diego, CA, June 2007.
[8] N. P. Jouppi.
Improving direct-mapped cache performance by the addition of a small fully-associative cache and prefetch buffers.
SIGARCH Computer Architecture News, 18(3a), 1990.
[9] C. Kim, S. Sethumadhavan, D. Gulati, D. Burger, M. Govindan, N. Ranganathan, and S. Keckler.
Composable lightweight processors.
In Proceedings of the 40th Annual IEEE/ACM International Symposium on Microarchitecture, 2007.

444 :YAMAGUTIseisei:2018/07/08(日) 14:52:51.19 ID:r8hmMT68N ?2BP(3)
[10] S. Sethumadhavan, F. Roesner, J. S. Emer, D. Burger, and S. W. Keckler.
Late-binding: enabling unordered load-store queues.
In Proceedings of the 34th Annual International Symposium on Computer Architecture, pages 347-357, New York, NY, USA, 2007.
ACM.
[11] T. Sherwood, S. Sair, and B. Calder.
Predictor-directed stream buffers.
In Proceedings of the 33rd Annual ACM/IEEE International Symposium on Microarchitecture, 2000.
[12] A. Smith. Explicit Data Graph Compilation.
PhD thesis, The University of Texas at Austin, 2009.

446 :YAMAGUTIseisei:2018/08/06(月) 00:48:14.81 ID:FnAR0u04o ?2BP(3)
arXiv:1803.06617v1 [cs.AR] 18 Mar 2018

Microsoft Research Technical Report
Authored January 2014; Released March 2018

Towards an Area-Efficient Implementation of a High ILP EDGE Soft Processor
Jan Gray
Gray Research LLC
jsgray@acm.org

Aaron Smith
Microsoft Research
aaron.smith@microsoft.com

Abstract―
In-order scalar RISC architectures have been the dominant paradigm in FPGA soft processor design for twenty years.
Prior out-of-order superscalar implementations have not exhibited competitive area or absolute performance.
This paper describes a new way to build fast and area-efficient out-of-order superscalar soft processors by utilizing an Explicit Data Graph Execution (EDGE) instruction set architecture.
By carefully mapping the EDGE microarchitecture, and in particular, its dataflow instruction scheduler, we demonstrate the feasibility of an out-of-order FPGA architecture.
Two scheduler design alternatives are compared.

Index Terms―
Explicit Data Graph Execution (EDGE);
hybrid von-Neumann dataflow;
FPGA soft processors

447 :YAMAGUTIseisei:2018/08/06(月) 00:54:39.80 ID:FnAR0u04o ?2BP(3)
I.
INTRODUCTION

Design productivity is still a challenge for reconfigurable computing.
It is expensive to port workloads into gates and to endure 10^2 to 10^4 second bitstream rebuild design iterations.
Soft processor array overlays can help mitigate these costs.
The costly initial port becomes a simple cross-compile targeting the soft processors, and most design turns are quick recompiles.
Application bottlenecks can then be offloaded to custom hardware exposed as new instructions, function units, autonomous accelerators, memories, or interconnects.
The advent of heterogeneous FPGAs with hard ARM cores does not diminish the complementary utility of soft cores.
As FPGAs double in capacity, the potential number of soft processors per FPGA also doubles.
A mid-range FPGA can now host many hundreds of soft processors and their memory interconnection network, and such massively parallel processor and accelerator arrays (MPPAAs) can sustain hundreds of memory accesses and branches per cycle --
throughput that a few hard processors cannot match.
The microarchitectures of general purpose soft processors have changed little in two decades.
Philip Freidin's 16-bit RISC4005 (1991) was an in-order pipelined scalar RISC, as were j32, xr16, NIOS, and MicroBlaze [1]–[4], and as are their latest versions.
Over the years soft processors have gained caches, branch predictors, and other structures for boosting instruction level parallelism, but the basic scalar RISC microarchitecture still dominates.
This reflects a good fit between this simple microarchitecture and the FPGA primitive elements required to implement it -- particularly LUTs and one write/cycle LUT RAMs.
Unfortunately when such architectures take a cache miss, execution stops dead.

448 :YAMAGUTIseisei:2018/08/06(月) 00:56:08.62 ID:FnAR0u04o ?2BP(3)
Design studies targeting higher instruction level parallelism (ILP) microarchitectures typically implement VLIW [5], [6] or vector [7], [8] architectures instead of out-of-order (OoO) [9]– [11] soft processor cores.
The problem with superscalar OoO microarchitectures is the complexity of the machinery needed to rename registers, schedule instructions in dataflow order, clean up after misspeculation, and retire results in-order for precise exceptions.
This in turn requires expensive circuits such as deep many-ported register files, many-ported CAMs for dataflow instruction scheduling wakeup, and many wide bus multiplexers and bypass networks, all of which are area intensive in FPGAs.
For example, multi-read, multi-write RAMs require a mix of replication, multi-cycle operation, clock doubling, bank interleaving, live-value-tables, and other expensive techniques.
The present work is a new approach to build high ILP OoO superscalar soft processors without most of the complexity and overhead.
Our insight is to implement an Explicit Data Graph Execution (EDGE) [12], [13] instruction set architecture designed for area and energy efficient high ILP execution.



Together the EDGE architecture and its compiler finesse away much of the register renaming, CAMs, and complexity, enabling an out-of-order processor for only a few hundred LUTs more than an in-order scalar RISC.
This paper explores how an EDGE ISA and FPGA optimized EDGE microarchitecture compare to in-order RISCs common on FPGAs today.
The key challenge, and the main contribution of the paper, is how to build a small, fast dataflow instruction scheduler in an FPGA.
We develop and contrast two alternative FPGA implementations on our way to developing a minimal-area EDGE soft processor.

449 :YAMAGUTIseisei:2018/08/06(月) 00:57:07.53 ID:FnAR0u04o ?2BP(3)
z = x + y;
if (z <= 5) {

x=R0, y=R7
HEADER
I[0] READ R0 T[2R]
I[1] READ R7 T[2L]
I[2] ADD T[3L]
I[3] TLEI #5 B[1P]
I[4] BRO.T B1
I[5] BRO.F B1



x += 1;
y -= 1;
x /= y;

HEADER
I[0] READ R0 T[2L]
I[1] READ R7 T[3L]
I[2] ADD #1 T[4L]
I[3] SUB #1 T[4R]
I[4] DIV W[R0]
I[5] BRO

}

450 :YAMAGUTIseisei:2018/08/06(月) 00:58:15.18 ID:FnAR0u04o ?2BP(3)
[Diagram: the instruction window holding the decoded instructions READ R0, READ R7, ADD, TLEI #5, BRO.T, and BRO.F, with their operand buffers (BP, operand 0, operand 1) and pending targets 2R, 2L, and 3L.]

Fig. 1:
Pseudo code and corresponding instruction block.

451 :YAMAGUTIseisei:2018/08/06(月) 01:00:51.33 ID:FnAR0u04o ?2BP(3)
OPCODE  PR  BID  XOP  TARGET1  TARGET0
  7b    2b  2b   3b     9b       9b

PR  = PREDICATE
BID = BROADCAST ID
XOP = EXTENDED OPCODE

Fig. 2:
General instruction format

452 :YAMAGUTIseisei:2018/08/06(月) 01:06:59.49 ID:FnAR0u04o ?2BP(3)
II.
EDGE OVERVIEW

EDGE architectures [12], [14]–[16] execute instructions organized within instruction blocks that are fetched, executed, and committed atomically.
Instructions inside blocks execute in dataflow order, which removes the need for expensive register renaming and provides power efficient out-of-order execution.
The compiler explicitly encodes the data dependencies through the instruction set architecture, freeing the microarchitecture from rediscovering these dependencies at runtime.
Using predication, all intra-block branches are converted to dataflow instructions, and all dependencies other than memory are direct data dependencies.
This target form encoding allows instructions within a block to communicate their operands directly via operand buffers, reducing the number of accesses to a power hungry multi-ported physical register file.
Between blocks, instructions communicate using memory and registers.
By utilizing a hybrid dataflow execution model, EDGE architectures still support imperative programming languages and sequential memory semantics, but reap the benefits of out-of-order execution with near in-order power efficiency and complexity.
Figure 1 shows an example of two EDGE instruction blocks and how instructions explicitly encode their targets.
In this example each block corresponds to a basic block.
The first two READ instructions target the left (T[2L]) and right (T[2R]) operands of the ADD instruction.
A READ is the only instruction that reads from the global register file (however any instruction may target, i.e.
write to, the global register file).
When the ADD receives the result of both register reads it will become ready and execute.

453 :YAMAGUTIseisei:2018/08/06(月) 01:07:39.23 ID:FnAR0u04o ?2BP(3)
Figure 2 shows the general instruction format.
Each EDGE instruction is 32 bits and supports encoding up to two target instructions.
For instructions with more consumers than target fields, the compiler can build a fanout tree using move instructions or it can assign high fanout instructions to broadcasts [15].
Broadcasts support sending an operand over a lightweight network to any number of consumer instructions in a block.
In Figure 1, when the TLEI (test-less-than-equal-immediate) instruction receives its single input operand from the ADD it will become ready and execute.
The test then produces a predicate operand that is broadcast on channel one (B[1P]) to all instructions listening on the broadcast channel, which in this example are the two predicated branch instructions (BRO.T and BRO.F).
The branch that receives a matching predicate will fire.
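
A short C sketch of this broadcast behavior (invented structures; only the mechanism from the text): the TLEI outcome is broadcast on a channel, and of the listeners only the branch whose encoded polarity matches the predicate fires.

#include <stdbool.h>
#include <stdio.h>

struct listener { const char *name; bool polarity; }; /* BRO.T / BRO.F */

static void broadcast(bool pred, const struct listener *l, int n)
{
    for (int i = 0; i < n; i++)
        if (l[i].polarity == pred)       /* matching predicate fires */
            printf("%s fires\n", l[i].name);
}

int main(void)
{
    const struct listener chan1[2] = { { "BRO.T B1", true },
                                       { "BRO.F B1", false } };
    int x = 3, y = 2, z = x + y;
    broadcast(z <= 5, chan1, 2);         /* TLEI #5 on channel 1 */
    return 0;
}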

A spectrum of EDGE implementations are possible with various area and performance tradeoffs.
Prior EDGE research studied very wide issue implementations [12], [13], as well as fusion of multiple cores [14]–[16] to boost performance on scalar workloads.
In this work we focus on MPPAA scenarios utilizing compact EDGE soft processors with competitive performance/area.
Therefore data and pointers are 32 bits; blocks can be up to 32 instructions long; and the microarchitecture decodes 1-2 instructions per clock and issues one.
We further restrict the load-store queue (LSQ) in this study to a simple, non-speculative design and omit branch or memory dependence prediction.

454 :YAMAGUTIseisei:2018/08/06(月) 01:09:57.45 ID:FnAR0u04o ?2BP(3)
III.
EDGE IN AN FPGA

[Diagram, front end: IF stage with instruction cache data (nK x 32 x 2 ports, block RAM); DC stage with the decoder(s); IS stage with the 32-entry instruction window and dataflow instruction scheduler (producing T1, T0, IID), the decoded instructions buffer (32 x n LUT RAMs), and the operand buffers (32 x 32 LUT RAMs).]

455 :YAMAGUTIseisei:2018/08/06(月) 01:11:14.79 ID:FnAR0u04o ?2BP(3)
[Diagram, back end: EX stage pipeline registers and function units fed by the 32 x 32 operand buffers (OPS0); LS stage with the load/store queue, data cache data (nK x 32 block RAM), and LS pipeline registers; a 32 x 32 LUT-RAM register file (replicated x2) alongside.]

Fig. 3:
Two decode, single issue EDGE microarchitecture.

456 :YAMAGUTIseisei:2018/08/06(月) 01:14:03.00 ID:FnAR0u04o ?2BP(3)
A.
Microarchitecture
Figure 3 is an example microarchitecture for a compact EDGE processor.
It has much in common with a conventional in-order scalar RISC: instruction and data caches and a five-stage pipeline including instruction fetch (IF), decode (DC), operand fetch, execute (EX), and memory/data cache access (LS).
Unlike an in-order processor, instruction operands are read from operand buffers, not the register file; and the instruction to execute next, in dataflow order, is determined by the IS (issue) pipeline stage.
This employs an instruction window comprising a dataflow instruction scheduler, a decoded instructions buffer, and operand buffers.
It uses a simple load-store queue to issue memory instructions in program order.
The front end (IF, DC) runs decoupled from the back end (IS, EX, LS).
It fetches and decodes two instructions per clock into the instruction window.
The instruction window's dataflow scheduler keeps the ready state of each decoded instruction's inputs, i.e., its predication and operands.
When all of its inputs (if any) are ready, the instruction wakes up and is ready to issue.
The lowest numbered ready instruction IID is selected each cycle and its decoded instruction and input operands are read.
Besides the data mux and function unit control signals, this instruction encodes up to two ready events.
The scheduler accepts these and/or events from other sources (muxed into T0 and T1) and updates the ready state of other instructions in the window.
Thus dataflow execution proceeds, starting with the block's ready 0-input instructions, then instructions that these target, and so forth.

457 :YAMAGUTIseisei:2018/08/06(月) 01:16:14.13 ID:FnAR0u04o ?2BP(3)
B.
EDGE dataflow instruction scheduling requirements
The instruction window and scheduler are the linchpin of the core.
Their area, clock period, capabilities, and limitations largely determine the realized performance of an EDGE core and the throughput of an EDGE multiprocessor.

The instruction scheduler has diverse functionality and requirements.
It is highly concurrent.
Each cycle the decoder(s) write instructions' decoded ready state and decoded instructions into the window.
Each cycle the scheduler selects the next instruction to issue, and in response the back end sends ready events --
either target ready events targeting a specific instruction's input slot (predicate, operand #0, operand #1) or broadcast ready events targeting all instructions waiting on a broadcast ID.
These set per-instruction active ready state bits which together with the decoded ready state may signal that the instruction is ready to issue.
Note the scheduler sometimes accepts events for target instructions which have not yet been decoded and must also inhibit reissue of issued ready instructions.
EDGE instructions may be non-predicated, or predicated true or false.
A predicated instruction does not become ready until it is targeted by another instruction's predicate result, and that result matches the predicate condition.
If the predicate doesn't match, the instruction never issues.


458 :YAMAGUTIseisei:2018/08/06(月) 01:19:27.57 ID:FnAR0u04o ?2BP(3)
On branch to a new block all instruction window ready state is flash cleared (block reset).
However when a block branches back to itself (block refresh) only active ready state is cleared; the decoded ready state is preserved so that it is not necessary to re-fetch and decode the block's instructions.
Refresh is key to saving time and energy in loops.
Since some software critical paths consist of a single chain of dependent instructions, i.e.
A targets B targets C, it is important that the dataflow scheduler add no pipeline bubbles for successive back-to-back instruction wakeup.
Therefore the IS-stage ready-issue-target-ready pipeline recurrence should complete in one cycle -- assuming this does not severely impact clock frequency.
Instructions such as ADD have a latency of one cycle.
With EX-stage result forwarding the scheduler can wake their target instructions in the IS-stage, even before the instruction completes.
Other instruction results may await ALU comparisons, take multiple cycles, or have unknown latency.
These must wait until later to wake their targets.
Finally, the scheduler design should be scalable across a spectrum of anticipated EDGE implementations -- each cycle accepting at least 1-4 decoded instructions and 2-4 target ready events, and issuing 1-2 instructions per cycle.
We consider two alternative dataflow instruction scheduler designs:
a brute-force parallel scheduler, where instructions' ready state is explicitly represented in FPGA D-flip-flops (FFs), in which the ready status of every instruction is reevaluated each cycle;
and a more compact incremental scheduler which keeps ready state in LUT RAM and which updates ready status of only 2-4 target instructions per cycle.

459 :YAMAGUTIseisei:2018/08/06(月) 01:20:35.82 ID:FnAR0u04o ?2BP(3)
C.
A parallel instruction scheduler

[Diagram: 32 scheduler entries (numbered 31 down to 0), each holding decoded ready state (DBID, DRT, DRF, DR0, DR1, written as DEC.RDYS and cleared on RESET) and active ready state (RDY, RT, RF, R0, R1, INH, cleared on RESET or REFRESH); target events T1/T0, broadcast ID BID, and enables ENs drive the NEXT RDYS logic, and a 32→(5,1) priority encoder reduces the RDY vector to IID,V.]


Fig. 4:
Block diagram of a parallel dataflow scheduler, with entry #2 shown in more detail.

460 :YAMAGUTIseisei:2018/08/06(月) 01:22:51.58 ID:FnAR0u04o ?2BP(3)
Figure 4 depicts a parallel instruction scheduler for the instruction window of Figure 3.
Note the active ready state is set by target ready events T0, T1 and broadcast ID BID (if any), qualified by various input type enables ENs.
For a 32-entry window there are 32 instances of a one-instruction ready circuit.
In any cycle one or more of the 32 RDY signals may be asserted.
A 32-bit priority encoder reduces this to the 5-bit IID of the next instruction to issue.
For each entry there are six bits of decoded ready state, i.e., state initialized by the instruction decoder:

• DBID: 2-bit binary broadcast ID, or 00 if none
• DRT, DRF: decoder: predicate true (false) is ready
• DR0, DR1: decoder: operand #0 (operand #1) is ready

Together these bits encode whether the instruction has been decoded, awaits a predicate and/or some operand(s), perhaps via a broadcast channel, or is immediately ready to issue.
These bits are cleared on block reset only.
There are also six bits of active ready state:

• RT, RF: predicate true (false) is ready
• R0, R1: operand #0 (operand #1) is ready
• INH: inhibit instruction -- it has already issued
• RDY: instruction is ready to issue


461 :YAMAGUTIseisei:2018/08/06(月) 01:25:56.54 ID:FnAR0u04o ?2BP(3)
An instruction is ready iff (RT & RF & R0 & R1 & ~INH).
Any of RT, RF, R0, R1 may be set when:
its corresponding DRX is set by the decoder, or
an executing instruction targets that input, explicitly, or via a broadcast event (broadcast ID, input).
Active ready state bits are cleared on block reset or refresh.

              Decoded ready state      Active ready state
Instruction   DBID DRT DRF DR0 DR1   RT RF R0 R1 INH   RDY
READ           00   1   1   1   1     1  1  1  1   1    0
READ           00   1   1   1   1     1  1  1  1   0    1
ADD            00   1   1   0   0     1  1  1  0   0    0
TLEI           00   1   1   0   1     1  1  0  1   0    0
BRO.T B1       01   0   1   1   1     0  1  1  1   0    0
BRO.F B1       01   1   0   1   1     1  0  1  1   0    0
undecoded      00   0   0   x   x     0  0  x  x   x    0

TABLE I: Example Instruction Scheduler Ready State

462 :YAMAGUTIseisei:2018/08/06(月) 01:28:42.21 ID:FnAR0u04o ?2BP(3)
Table I depicts a block's instruction scheduler state after decoding six instructions and issuing the first.
The first four non-predicated instructions have DRT and DRF set reflecting that they do not await any particular predicate results.
The two READ instructions, unpredicated and with zero input operands, are immediately ready to issue.
The first has issued -- and so is now inhibited from reissue -- targeting operand 0 of the ADD, whose R0 is now set.
The second READ will issue in the next IS pipeline cycle.
The TLEI (test-less-than-or-equal-immediate) instruction broadcasts its predicate outcome on channel 1; the two branch instructions, predicated true (resp.
false), await this predicate result.
The seventh entry has not been decoded: (DRT|DRF)=0.
To reduce the critical path of dataflow scheduling, the front end writes predecoded EDGE instructions into the decoded instructions buffer.
As instruction IID issues, its decoded instruction is read by the back end.
Amongst other things it contains two target operand ready event fields, _T0 and _T1, which designate the 0-2 (IID, input) explicit targets of the instruction, as well as a 4-bit vector of input enables: ENs = {RT_EN, RF_EN, R0_EN, R1_EN}.
Referring back to Figure 3, these signals are muxed with ready events from other pipeline stages into T0 and T1 input by the scheduler.

465 :YAMAGUTIseisei:2018/08/06(月) 01:41:18.68 ID:FnAR0u04o ?2BP(3)
D.
FPGA implementation of the parallel scheduler
Careful attention to FPGA circuit design is required to minimize the area and clock period of the scheduler.
The 32-instruction window requires 32*(6+6)=384 FFs for the ready state, and 32*many LUTs to decode ready events and update each entry's ready state.
A modern FPGA packs a set of LUTs (lookup tables) and D-flip-flops (FFs) together into a logic cluster.
For example, Xilinx 7 series devices group four 6-LUTs and eight FFs into each ``slice'' cluster.
Each LUT has two outputs and may be used as one 6-LUT, or two 5-LUTs with five common inputs.
Each output may be registered in a FF.
The flip-flops have optional CE (clock enable) and SR (set/reset) inputs but these signals are common to all eight FFs in the cluster.
This basic cluster architecture is similar in Altera FPGAs.
From this follow two design considerations.
Fracturable 6-LUT decoders: For target instruction index decoding, so long as indices are ≤5 bits, two decoders may fit into a single 6-LUT.
Slice FF packing and cluster control set restrictions: To minimize area and wire delays, the design packs the ready state FFs densely, 4-8 FFs per cluster.
Every 6-bit decoded ready state entry is written together (common RST and CE) and can pack into one or two slices.
More care is required for the active ready state FFs.
Each of these 32*6=192 FFs may be individually set, but by packing four FFs per slice, when one FF is clock enabled, all are clock enabled.
Whenever a FF is set by a ready event the other FFs in its slice should not change.
This requires implementing CE functionality in each FF's input LUT, feeding back its output into its input: FF_NXT = FF | (EN & input).

466 :YAMAGUTIseisei:2018/08/06(月) 01:42:14.67 ID:FnAR0u04o ?2BP(3)
generate for (i = 0; i < N; i = i + 1) begin: R
always @* begin
// target decoders
T00[i] = T0 == i;
T01[i] = T0 == (i|N);
T10[i] = T1 == i;
T11[i] = T1 == (i|N);
B[i] = BID == DBID[i];

// next active ready state logic
RT_NXT[i] = RT[i] | DRT[i]
| (RT_EN & (T01[i]|T11[i]|B[i]));
RF_NXT[i] = RF[i] | DRF[i]
| (RF_EN & (T00[i]|T10[i]|B[i]));
R0_NXT[i] = R0[i] | DR0[i]
| (R0_EN & (T00[i]|T10[i]|B[i]));
R1_NXT[i] = R1[i] | DR1[i]
| (R1_EN & (T01[i]|T11[i]|B[i]));
INH_NXT[i] = INH[i] | (INH_EN & (IID == i));
RDY_NXT[i] = RT_NXT[i] & RF_NXT[i] & R0_NXT[i]
& R1_NXT[i] & ~INH_NXT[i];
end
end endgenerate

Listing 1: Parallel scheduler ``next readys'' logic

467 :YAMAGUTIseisei:2018/08/06(月) 01:46:49.27 ID:FnAR0u04o ?2BP(3)
Listing 1 is Verilog that generates the ``next readys'' for an N-entry parallel scheduler.
Although there are four ready event input types (predicate true, false, operand #0, operand #1),
by ensuring that predicate target events never occur in the same cycle as operand target events, a single target index bit suffices to distinguish false/operand #0 targets from true/operand #1 targets.
(Further decoding is provided by specific {RT/RF/R0/R1} ENs enables.) Therefore for an instruction window with N=32 entries, T0 and T1 are six bits {input#:1; IID:5}.
The target decoders T00, T01, T10, T11 (target-0-input-0, etc.) are each one 6-LUT, as is the broadcast select decoder B.
The next active ready state logic folds together the target decoder outputs with current active and decoded ready state.
This requires another seven LUTs (two for INH_NXT), for a total of 32*12 = 384 LUTs.
This may be improved by splitting the 32-entry scheduler into two 16-entry banks of even and odd instructions.
Within a bank a 4-bit bank-IID suffices.
Then T0, T1 narrow to five bits so T00, T01, T10, T11 fit in two 5,5-LUTs, and INH_NXT in one 6-LUT, or 2*16*(3+6)=288 LUTs in all.

469 :YAMAGUTIseisei:2018/08/06(月) 01:56:04.52 ID:FnAR0u04o ?2BP(3)
Besides shaving LUTs, a two bank scheduler provides two sets of T0, T1 ports and can sink two sets of two events each cycle.
This is essential to sustain wider issue rates of two instructions per cycle (which may target four operands per cycle).
Yet-wider issue and yet-larger instruction windows may even merit a four bank design.
The ready-issue-target-ready scheduling recurrence is the critical path of the IS-stage.
A 32→5 priority encoder reduces the RDY vector to an IID which selects the decoded instruction.
The decoded instruction's fields { T0, T1, BID, ENs} are muxed into {T0,T1,BID,ENs} which update target instructions' ready state, including RDY.
Priority encoder: Many 32-bit encoder designs were evaluated, including one-hot conversion with LUT or carry-logic or-trees, carry-logic zero-scan, and F7MAP/F8MAP muxes.
The present design uses two 16→4 encoders, one per bank, which complete in two LUT delays.
In a one-issue processor, a subsequent 2:1 mux selects one of these encoder outputs.
In particular each 16-bit encoder input I[15:0] is chunked into I[15], I[14:10], I[9:5], I[4:0].
Each 5-bit group indexes a 32x4 LUT ROM with the precomputed encoder output for that group.
Together with three 5-bit zero comparator outputs, these feed a custom 4-bit 3:1 selector which outputs 'b1111 when all three groups are zero.
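
The C model below restates that encoder algorithm (an algorithmic illustration, not the netlist; rom() stands in for the precomputed 32x4 LUT ROMs, and priority goes to the lowest-numbered ready instruction as stated earlier):

#include <stdint.h>
#include <stdio.h>

static unsigned rom(unsigned g)   /* precomputed lowest-set-bit table */
{
    for (unsigned b = 0; b < 5; b++)
        if (g & (1u << b))
            return b;
    return 0;                     /* unused when the group is zero    */
}

static unsigned encode16(uint16_t in, unsigned *valid)
{
    unsigned g0 = in & 0x1f, g1 = (in >> 5) & 0x1f, g2 = (in >> 10) & 0x1f;
    *valid = in != 0;
    if (g0) return rom(g0);       /* lowest group wins                */
    if (g1) return 5 + rom(g1);
    if (g2) return 10 + rom(g2);
    return 0xf;                   /* all three groups zero: 'b1111    */
}

int main(void)
{
    unsigned v, iid = encode16(0x0240, &v); /* bits 6 and 9 set       */
    printf("iid=%u valid=%u\n", iid, v);    /* prints iid=6 valid=1   */
    return 0;
}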
Technology mapping and floorplanning:
The design uses an RPM (relationally placed macro) methodology to improve area and interconnect delays and achieve a repeatable layout for easy routing and timing closure under module composition and massive replication.
Structural RTL instantiates modules and tiles them into a scheduler.
The XST annotation (*LUT_MAP="yes"*) on a ≤6-input module locks its logic to one LUT; (*RLOC="XxYy"*) packs FPGA primitives into clusters and places clusters relative to each other.

470 :YAMAGUTIseisei:2018/08/06(月) 01:57:08.33 ID:FnAR0u04o ?2BP(3)
Fig. 5:
FPGA implementation of the parallel scheduler

Figure 5 is a Xilinx 7-series implementation of Figure 4, including the scheduler, priority encoder, and decoded instruction buffer, with the critical path highlighted in white.
Each two horizontal rows of FPGA slices correspond to four entries in the instruction window.
Left to right are:

• pale yellow: four 6-bit decoded ready state flip-flops;
• yellow/green: B, T00, T01, T10, T11 target decoders;
• orange: active ready state LUTs/FFs RT_NXT/RT, etc.;
• purple: INH_NXT and INH;
• red: RDY_NXT and RDY.

To the right are the synthesized priority encoders and muxes (blue) and the decoded instructions buffer (white) implemented in several 32x6-bit true dual port LUT RAMs.
Performance: In a Kintex-7 -1-speed grade, the critical path takes 5.0 ns, including RDY clock-to-out, priority encoder, mux, decoded instructions LUT RAM, next readys logic and RDY setup.
Interconnect delay is 85% of the critical path -- unfortunately all paths from any RDY to any RDY must traverse a relatively large diameter netlist.
Cycle time may be reduced to 2.9 ns by adding a pipeline register halfway through the scheduler critical path (the output port of the instruction buffer LUT RAM); however, this will not achieve back-to-back issue (in successive cycles) of a single dependent chain of instructions.

471 :YAMAGUTIseisei:2018/08/06(月) 02:01:58.00 ID:FnAR0u04o ?2BP(3)
E.
Incremental dataflow scheduler ready state
The parallel scheduler is straightforward but it consumes hundreds of LUTs and FFs just to maintain 32x12b of ready state -- a few LUTs worth of LUT RAM -- and this area doubles as the instruction window size doubles.
Also, each cycle its next readys LUTs recompute the readiness of every instruction, even though (broadcast notwithstanding) each issued instruction affects at most two others' ready state.
In contrast, the incremental scheduler keeps decoded and active ready state in LUT RAM, maintains the frontier of ready instructions in queues, and evaluates the ready status of just 2-4 target instructions per cycle.




Compared to an array of FFs, LUT RAM is fast and dense but has some shortcomings: there is no way to flash clear it, and it only supports one write per cycle.

472 :YAMAGUTIseisei:2018/08/06(月) 02:04:55.29 ID:FnAR0u04o ?2BP(3)
[Diagram: DRDYSS (decoded ready state LUT RAM; write address DC_IID, write data DC_DRDYS, read address EVT_IID) and ARDYSS (active ready state LUT RAM; read and write address EVT_IID, write data ARDYS_NXT), validated by the FC-SO-RAMs DVS (cleared on RESET) and AVS (cleared on RESET or REFRESH); the ready logic combines DV, DRDYS, AV, ARDYS, and EVT_RDYS into ARDYS_NXT and READY.]

(a) Design: ready state, validation, and ready logic.

473 :YAMAGUTIseisei:2018/08/06(月) 02:07:43.81 ID:FnAR0u04o ?2BP(3)
(b) FPGA implementation.

Fig. 6:
A 16-entry scheduler bank.

474 :YAMAGUTIseisei:2018/08/06(月) 02:09:43.22 ID:FnAR0u04o ?2BP(3)
// ready logic
always @* begin
ARDYS_NXT = (DV ? DRDYS : 4'b0000)
| (AV ? ARDYS : 4'b0000)
| EVT_RDYS;
READY = &ARDYS_NXT;
end

Listing 2: Ready logic

475 :YAMAGUTIseisei:2018/08/06(月) 02:12:33.24 ID:FnAR0u04o ?2BP(3)
Instead, the scheduler uses a hybrid of LUT RAM and FF ``RAM''.
Decoded (DRT, DRF, DR0, DR1) and active (RT, RF, R0, R1) ready state are stored in several banks of 16x4 true dual port LUT RAM, which is validated by a 16x1 flash-clearable, set-only RAM (``FC-SO-RAM'').
This comprises 16 FFs (with common reset), 16 write port address decoders (eight 5,5-LUTs), and a 16:1 read port mux (four 6-LUTs, two MUXF7s, one MUXF8) -- just three slices in all.
Each read from this hybrid reads the 4b LUT RAM entry and its valid bit.
Each write updates the LUT RAM and sets its valid bit.
Multiple LUT RAM write ports.
To sustain a fetch/decode rate of d instructions/cycle, and an issue rate of i instructions/cycle, it is necessary to update d+2i ready state entries each cycle.
This is a challenge for one write/cycle LUT RAM.
Rather than use clock doubling or replicated RAM banks with live value tables, the incremental scheduler divides the ready state up into four (or more) interleaved, disjoint banks:

(decoded, active) ready state for (even, odd) instructions.
Then the front end can write even and odd decoded ready state while the back end updates even and/or odd target instructions' active ready state.
Figure 6 shows the resulting 16-entry scheduler bank design and implementation.
The blue decoded and active ready state LUT RAMs DRDYSS and ARDYSS are validated by orange/red FC-SO-RAMs DVS and AVS.
Each cycle the decoder writes instruction DC_IID's decoded ready state DC_DRDYS and its valid bit.
Also each cycle the bank's target ready event EVT ::= {EVT_IID; EVT_RDYS} is processed via a read-modify-write of EVT_IID's ARDYS with its DRDYS and EVT_RDYS.
See Listing 2.
The instruction is ready when all four ARDYS bits are set.
All of this logic (cyan) requires just one slice; as an optimization READY's and-reduction is in carry-logic.
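
The read-modify-write just described (and Listing 2) can be modeled behaviorally in C; the arrays below stand in for the LUT RAMs and the FC-SO-RAM valid bits, names follow the figure, and the trace reproduces the ADD wakeup worked through later in the text. This restates the described semantics, not the RTL.

#include <stdbool.h>
#include <stdio.h>

#define BANK 16

static unsigned drdys[BANK], ardys[BANK]; /* 4b ready state per entry */
static bool     dv[BANK],    av[BANK];    /* FC-SO-RAM valid bits     */

static bool event(int iid, unsigned evt_rdys)
{
    unsigned nxt = (dv[iid] ? drdys[iid] : 0)   /* Listing 2 semantics */
                 | (av[iid] ? ardys[iid] : 0)
                 | evt_rdys;
    ardys[iid] = nxt;                           /* write back          */
    av[iid] = true;
    return nxt == 0xf;                          /* READY = &ARDYS_NXT  */
}

static void refresh(void)   /* block refresh: clear active state only */
{
    for (int i = 0; i < BANK; i++)
        av[i] = false;      /* decoded state (dv, drdys) is preserved */
}

int main(void)
{
    dv[2] = true; drdys[2] = 0xC;      /* ADD decoded: 'b1100           */
    printf("%d\n", event(2, 0x1));     /* operand #1 -> 'b1101, not rdy */
    printf("%d\n", event(2, 0x2));     /* operand #0 -> 'b1111, ready   */
    refresh();                         /* loop back to the same block   */
    printf("%d\n", event(2, 0x1));     /* operands must re-arrive -> 0  */
    return 0;
}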

476 :YAMAGUTIseisei:2018/08/06(月) 02:13:32.42 ID:FnAR0u04o ?2BP(3)
Note that the EDGE compiler does not guarantee that both targets of an instruction are in disjoint banks so there may be scheduler bank conflicts.
An ADD instruction might target an operand of instruction 10 and an operand of instruction 12.
Since it is not possible to update the active ready state of the two even bank targets in the same cycle, one event is processed and the other is queued for a later cycle.

477 :YAMAGUTIseisei:2018/08/06(月) 02:17:11.29 ID:FnAR0u04o ?2BP(3)
F.
Incremental dataflow scheduler design, operation, and implementation
The core of the scheduler (Figure 7) consists of:
• INSN: decoded instruction with two target event fields
• EVT0, EVT1: even/odd pending event registers
• even/odd event muxes, controlled by predecoded selects
• SCH0, SCH1: even/odd 16-entry scheduler banks
• three ready instruction IID queues:
-- DCRDYQ: decoder ready queue;
-- ISRDYQ: issue (scheduler) ready queue;
-- LSRDYQ: load/store ready queue
• two 3:1 selectors to select next IID
• INSNS: decoded instructions RAM (read port)
Note, in this design, the scheduler recurrence cycle begins and ends with the decoded instruction register.
Now consider execution of the first EDGE code block in Figure 1.
The scheduler is reset, clearing DVS, AVS in SCH0, SCH1.
The front end fetches the block's header and fetches and decodes its instructions into INSNS.
The two READs are ready to issue so their IIDs are enqueued on DCRDYQ.
This ``primes the pump'' for the back end.
The other instructions await operands or predicates, are not ready, so are not enqueued.


478 :YAMAGUTIseisei:2018/08/06(月) 02:20:36.99 ID:FnAR0u04o ?2BP(3)
[Figure 7a schematic: the INSN register with T0/T1 target-event fields, EVT0/EVT1 pending event registers, SCH0/SCH1 16-entry scheduler banks (READY, EVT, EVT_IID ports), the DCRDYQ/ISRDYQ/LSRDYQ ready queues, the IID selector, and INSNS, the 32xn LUT RAM of decoded instructions; labels 0-6 mark LUT delays along the critical path.]

(a) Design.

479 :YAMAGUTIseisei:2018/08/06(月) 02:21:24.24 ID:FnAR0u04o ?2BP(3)
(b) FPGA implementation.

Fig. 7:
32-entry scheduler, decoded instructions buffer, and ready queues.

480 :YAMAGUTIseisei:2018/08/06(月) 02:24:31.08 ID:FnAR0u04o ?2BP(3)
Back end dataflow execution proceeds as follows.
Initially INSN is invalid and both READYs are negated.
The IID selector tree selects/dequeues the first READ instruction (IID=0) from DCRDYQ.
The decoded READ instruction word is read from INSNS into INSN.

The READ targets ADD operand #1.
Its INSN.T0 (even bank target ready event) field is valid and its mux selects EVT=(2,'b0001) for SCH0.
That updates ADD's active ready state: 'b1100|'b0000|'b0001='b1101, now awaiting only the left operand (operand #0).
Since neither scheduler bank found a READY instruction, the IID selector tree selects/dequeues the second READ from DCRDYQ.

481 :YAMAGUTIseisei:2018/08/06(月) 02:26:44.11 ID:FnAR0u04o ?2BP(3)
This READ targets ADD operand #0;
its INSN.T0 is EVT=(2,'b0010).
SCH0 updates ADD's ready state to 'b1111 and asserts READY so the ADD (IID=2) issues.
ADD's T1 targets the TLEI ready state in SCH1.
TLEI becomes ready and issues.
As for TLEI, neither of its T0/T1 fields designates an IS-stage ready event.
Why? Unlike simple one cycle latency instructions like ADD, test instructions' targets cannot receive ready events until the test executes in the EX-stage.
Once a test completes, its true/false predicate event(s) are signaled.
These proceed through queues and/or muxes (not shown) to the EVT0, EVT1 pending event registers, awaiting idle scheduler event slots.
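To make this walk-through concrete, here is a minimal Python sketch of the next-IID selection; the priority order among banks and queues is an assumption for illustration, since the paper only names the selectors:

from collections import deque

def select_next_iid(sch0_ready, sch1_ready, dcrdyq, isrdyq, lsrdyq):
    # A bank's READY instruction wins; otherwise dequeue a ready IID.
    for iid in (sch0_ready, sch1_ready):
        if iid is not None:
            return iid
    for q in (isrdyq, dcrdyq, lsrdyq):    # assumed priority order
        if q:
            return q.popleft()
    return None                           # back end idles this cycle

# e.g., after reset only the decoder queue is primed with the two READs:
dcrdyq = deque([0, 1])
assert select_next_iid(None, None, dcrdyq, deque(), deque()) == 0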

482 :YAMAGUTIseisei:2018/08/06(月) 02:27:11.41 ID:FnAR0u04o ?2BP(3)
Queues:
The design employs many elastic FIFO ready queues and event queues.
They are small and fast, built with up-down counters and Xilinx SRLC32E 32-bit variable-length shift register LUTs.
Besides DCRDYQ the present design has two other ready queues.
ISRDYQ:
In a "one issue" design, when an instruction issues and it targets and wakes two others, the even instruction issues next, and the odd one is queued on ISRDYQ.
LSRDYQ:
EDGE processors use a load-store queue to provide sequential memory semantics.
One simple area-optimized LSQ defers and reorders certain accesses; then when a (ready) load/store becomes issuable to memory the LSQ enqueues it on LSRDYQ.
Broadcast wakeup:
Each EDGE result broadcast may target and wake an arbitrary number of instructions in the window.
This is easy for a parallel scheduler but costly for an incremental one.
When a result is broadcast the scheduler must sequentially update the ready state of every decoded instruction with that broadcast input.
Therefore the decoder maintains queues (BR1Q, BR2Q, BR3Q) of IIDs of instructions with a given broadcast input.
Once a broadcast result is known, the scheduler begins to dequeue the BRnQ IIDs into EVTs presented to SCH0, SCH1.
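A sketch of that drain loop, reusing the SchedulerBank model above; the even/odd bank mapping (iid & 1) and the 2-events/cycle budget are assumptions for illustration:

def drain_broadcast_queue(brnq, banks, rdys_bits, budget=2):
    # Dequeue up to `budget` queued IIDs per cycle (the paper drains 1-2)
    # and convert each into a ready event for its target's scheduler bank.
    for _ in range(min(budget, len(brnq))):
        iid = brnq.popleft()
        banks[iid & 1].event(iid >> 1, rdys_bits)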
Performance:
The labels 0-6 in Figure 7a depict the number of "LUT delays" to each point in the scheduler critical path, the white path in Figure 7b.
In a Kintex-7 -1-speed grade, this takes 4.3 ns including INSN clock-to-out, EVT mux, SCH1's AVS read port mux, ARDYS_NXT and READY logic, IID selector, INSNS read, and INSN setup.
Here interconnect delay is just 70% of the critical path, reflecting relatively shorter nets and some use of LUT-local MUXF7/MUXF8/CARRY4 nets.
The scheduler clock period may be reduced to 2.5 ns by adding pipeline registers after the LUT RAM and FC-SO-RAM reads, but as with the parallel scheduler, pipelining precludes back-to-back issue of dependent instructions.

483 :YAMAGUTIseisei:2018/08/06(月) 02:28:25.23 ID:FnAR0u04o ?2BP(3)
G.
Comparing parallel and incremental schedulers
Table II summarizes the differences between the two dataflow scheduler designs.
The core of the incremental scheduler is less than a third the size of the parallel scheduler although the size advantage is smaller when the additional overhead of queues and muxes is added.
The incremental scheduler is also faster and the area*period metric is 2.6x better.


484 :YAMAGUTIseisei:2018/08/06(月) 02:29:34.41 ID:FnAR0u04o ?2BP(3)
Metric                    Parallel   Incremental   Units
Area, 32 entries               288            78   LUTs
Area, total, 32 entries        340           150   LUTs
Period                         5.0           4.3   ns
Period, pipelined              2.9           2.5   ns
Area, total * period          1700           645   LUT*ns
Broadcast                    flash     iterative
Event bank conflicts?        never     sometimes
Area, 4 events/cycle           288           156   LUTs
Area, 64 entries               576           130   LUTs

TABLE II:
Comparing parallel and incremental schedulers.


However the parallel scheduler retains some brute force advantages.
It can process a broadcast event in a single cycle, whereas the incremental scheduler must iteratively drain a broadcast queue at a rate of 1-2 instructions per cycle.
This may cause issue stalls in some workloads.
The incremental scheduler is also subject to even/odd target bank conflicts which may delay an instruction wake up.
Real workload studies are needed to measure whether these effects overshadow its substantial area*period advantage.
Finally consider future scale up to wider issue and larger instruction windows.
The parallel scheduler does not grow when subdivided into more banks to process twice as many events per cycle, whereas the incremental scheduler core area doubles.
To grow the instruction window to 64 entries, the parallel scheduler requires twice as much area, whereas the incremental scheduler area grows more modestly.

485 :YAMAGUTIseisei:2018/08/06(月) 02:30:24.21 ID:FnAR0u04o ?2BP(3)
IV.
CONCLUSION
This paper presents our work towards a practical out-of-order dataflow soft processor architecture for FPGAs.
We set out to discover whether a novel EDGE instruction set architecture, optimized for simpler high ILP microarchitectures in ASICs, is also a good fit for FPGAs, or whether general purpose soft processors will remain stuck in the scalar RISC slow lane.
We considered two different dataflow instruction scheduler designs and their optimized FPGA implementations.
In the context of commercial 200 MHz, 1,000-2,000 LUT soft processors, the limited FPGA resource cost and clock period impact of either design seems acceptable and practical.
Both design alternatives will scale well to future four-decode/two-issue implementations.


488 :YAMAGUTIseisei:2018/08/26(日) 16:15:24.18 ID:CL1hr8qnX ?2BP(3)
>156 ー 180824 1738 3eCkMSqb
:
>http://thenextweb.com/artificial-intelligence/2018/08/23/researchers-gave-ai-curiosity-and-it-played-video-games-all-day/
:
> Implementing curiosity in AI,
>
>OpenAI, the think tank specializing in the singularity co-funded by Elon Musk, has published extensive research on curiosity-driven learning ... ( http://pathak22.github.io/large-scale-curiosity/resources/largeScaleCuriosity2018.pdf
:

489 :YAMAGUTIseisei:2018/08/26(日) 16:18:00.66 ID:CL1hr8qnX ?2BP(3)

Large-Scale Study of Curiosity-Driven Learning


Yuri Burda* (OpenAI), Harri Edwards* (OpenAI), Deepak Pathak* (UC Berkeley),
Amos Storkey (Univ. of Edinburgh), Trevor Darrell (UC Berkeley), Alexei A. Efros (UC Berkeley)


Abstract

Reinforcement learning algorithms rely on carefully engineering environment rewards that are extrinsic to the agent.
However, annotating each environment with hand-designed, dense rewards is not scalable, motivating the need for developing reward functions that are intrinsic to the agent.
Curiosity is a type of intrinsic reward function which uses prediction error as reward signal.
In this paper: (a) We perform the first large-scale study of purely curiosity-driven learning, i.e. without any extrinsic rewards, across 54 standard benchmark environments, including the Atari game suite.
Our results show surprisingly good performance, and a high degree of alignment between the intrinsic curiosity objective and the hand-designed extrinsic rewards of many game environments.
(b) We investigate the effect of using different feature spaces for computing prediction error and show that random features are sufficient for many popular RL game benchmarks,
but learned features appear to generalize better (e.g. to novel game levels in Super Mario Bros.).
(c) We demonstrate limitations of the prediction-based rewards in stochastic setups.
Game-play videos and code are at http://pathak22.github.io/large-scale-curiosity/.

490 :YAMAGUTIseisei:2018/08/26(日) 16:20:05.55 ID:CL1hr8qnX ?2BP(3)
1 Introduction

Reinforcement learning (RL) has emerged as a popular method for training agents to perform complex tasks.
In RL, the agent policy is trained by maximizing a reward function that is designed to align with the task.
The rewards are extrinsic to the agent and specific to the environment they are defined for.
Most of the success in RL has been achieved when this reward function is dense and well-shaped, e.g., a running "score" in a video game [21].
However, designing a well-shaped reward function is a notoriously challenging engineering problem.
An alternative to "shaping" an extrinsic reward is to supplement it with dense intrinsic rewards [26], that is, rewards that are generated by the agent itself.
Examples of intrinsic reward include "curiosity" [11, 22, 27, 35, 40] which uses prediction error as reward signal, and "visitation counts" [3, 20, 24, 30] which discourages the agent from revisiting the same states.
The idea is that these intrinsic rewards will bridge the gaps between sparse extrinsic rewards by guiding the agent to efficiently explore the environment to find the next extrinsic reward.

491 :YAMAGUTIseisei:2018/08/26(日) 16:20:46.49 ID:CL1hr8qnX ?2BP(3)
But what about scenarios with no extrinsic reward at all? This is not as strange as it sounds.
Developmental psychologists talk about intrinsic motivation (i.e., curiosity) as the primary driver in the early stages of development [32, 41]: babies appear to employ goal-less exploration to learn skills that will be useful later on in life.
There are plenty of other examples, from playing Minecraft to visiting your local zoo, where no extrinsic rewards are required.
Indeed, there is evidence that pre-training an agent on a given environment using only intrinsic rewards allows it to learn much faster when fine-tuned to a novel task in a novel environment [27, 28].
Yet, so far, there has been no systematic study of learning with only intrinsic rewards.

*Alphabetical ordering; the first three authors contributed equally.

Preprint.
Work in progress.

492 :YAMAGUTIseisei:2018/08/26(日) 16:22:09.28 ID:CL1hr8qnX ?2BP(3)

Figure 1:
A snapshot of the 54 environments investigated in the paper.
We show that agents are able to make progress using no extrinsic reward, or end-of-episode signal, and only using curiosity.
Video results, code and models at http://pathak22.github.io/large-scale-curiosity/.

493 :YAMAGUTIseisei:2018/08/26(日) 16:23:45.97 ID:CL1hr8qnX ?2BP(3)
In this paper, we perform a large-scale empirical study of agents driven purely by intrinsic rewards across a range of diverse simulated environments.
In particular, we choose the dynamics-based curiosity model of intrinsic reward presented in Pathak et al. [27] because it is scalable and trivially parallelizable, making it ideal for large-scale experimentation.
The central idea is to represent intrinsic reward as the error in predicting the consequence of the agent's action given its current state, i.e., the prediction error of the agent's learned forward dynamics.
We thoroughly investigate the dynamics-based curiosity across 54 environments: video games, physics engine simulations, and virtual 3D navigation tasks, shown in Figure 1.

To develop a better understanding of curiosity-driven learning, we further study the crucial factors that determine its performance.
In particular, predicting future state in high dimensional raw observation space (e.g., images) is a challenging problem and, as shown by recent works [27, 42], learning dynamics in an auxiliary feature space leads to improved results.
However, how one should choose such an embedding space is a critical, yet open research problem.
Through a systematic ablation, we examine the role of different ways to encode the agent's observation such that an agent can perform well driven purely by its own curiosity.

494 :YAMAGUTIseisei:2018/08/26(日) 16:27:38.51 ID:CL1hr8qnX ?2BP(3)
To ensure stable online training of dynamics, we argue that the desired embedding space should: (a) be compact in terms of dimensionality,
(b) preserve sufficient information about the observation, and (c) be a stationary function of the observations.
We show that encoding observations via a random network turns out to be a simple, yet effective technique for modeling curiosity across many popular RL benchmarks.
This might suggest that many popular RL video game test-beds are not as visually sophisticated as commonly thought.
Interestingly, we discover that although random features are sufficient for good performance at training, the learned features appear to generalize better (e.g., to novel game levels in Super Mario Bros.).

In summary:
(a) We perform a large-scale study of curiosity-driven exploration across a variety of environments including:
the set of Atari games [4], Super Mario Bros., virtual 3D navigation in Unity [1], multi-player Pong, and Roboschool [39] environments.
(b) We extensively investigate different feature spaces for learning the dynamics-based curiosity: random features, pixels, inverse-dynamics [27] and variational auto-encoders [15] and evaluate generalization to unseen environments.
(c) We conclude by discussing some limitations of a direct prediction-error based curiosity formulation.
We observe that if the agent itself is the source of stochasticity in the environment, it can reward itself without making any actual progress.
We empirically demonstrate this limitation in a 3D navigation task where the agent controls different parts of the environment.


495 :YAMAGUTIseisei:2018/08/26(日) 16:30:44.18 ID:CL1hr8qnX ?2BP(3)

2 Dynamics-based Curiosity-driven Learning

Consider an agent that sees an observation x_t, takes an action a_t and transitions to the next state with observation x_{t+1}.
We want to incentivize this agent with a reward r_t relating to how informative the transition was.
To provide this reward, we use an exploration bonus involving the following elements:
(a) a network to embed observations into representations φ(x),
(b) a forward dynamics network to predict the representation of the next state conditioned on the previous observation and action, p(φ(x_{t+1}) | x_t, a_t).
Given a transition tuple {x_t, x_{t+1}, a_t}, the exploration reward is then defined as r_t = -log p(φ(x_{t+1}) | x_t, a_t), also called the surprisal [2].
An agent trained to maximize this reward will favor transitions with high prediction error, which will be higher in areas where the agent has spent less time, or in areas with complex dynamics.
Such a dynamics-based curiosity has been shown to perform quite well across scenarios [27], especially when the dynamics are learned in an embedding space rather than raw observations.
In this paper, we explore dynamics-based curiosity and use the mean squared error corresponding to a fixed-variance Gaussian density as surprisal, i.e., ||f(x_t, a_t) - φ(x_{t+1})||_2^2, where f is the learned dynamics model.
However, any other density model could be used.
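As a minimal sketch of this reward, assuming placeholder functions forward_model and embed standing in for the learned dynamics f and the embedding φ (these names are illustrative, not from the paper):

import numpy as np

def curiosity_reward(forward_model, embed, x_t, a_t, x_t1):
    pred = forward_model(embed(x_t), a_t)        # f(x_t, a_t)
    target = embed(x_t1)                         # φ(x_{t+1})
    return float(np.sum((pred - target) ** 2))   # squared L2 prediction error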

2.1 Feature spaces for forward dynamics
Consider the representation φ in the curiosity formulation above.
If φ(x) = x, the forward dynamics model makes predictions in the observation space.
A good choice of feature space can make the prediction task more tractable and filter out irrelevant aspects of the observation space.
But what makes a good feature space for dynamics-driven curiosity? We narrow down a few qualities that a good feature space should have:

496 :YAMAGUTIseisei:2018/08/26(日) 16:33:46.18 ID:CL1hr8qnX ?2BP(3)
• Compact: The features should be easy to model by being low(er)-dimensional and filtering out irrelevant parts of the observation space.
• Sufficient: The features should contain all the important information.
Otherwise, the agent may fail to be rewarded for exploring some relevant aspect of the environment.
• Stable: Non-stationary rewards make it difficult for reinforcement agents to learn.
Exploration bonuses by necessity introduce non-stationarity since what is new and novel becomes old and boring with time.
In a dynamics-based curiosity formulation, there are two sources of non-stationarity: the forward dynamics model is evolving over time as it is trained and the features are changing as they learn.
The former is intrinsic to the method, and the latter should be minimized where possible.

In this work, we systematically investigate the efficacy of a number of feature learning methods, summarized briefly as follows:

Pixels
The simplest case is where φ(x) = x and we fit our forward dynamics model in the observation space.
Pixels are sufficient, since no information has been thrown away, and stable since there is no feature learning component.
However, learning from pixels is tricky because the observation space may be high-dimensional and complex.

Random Features (RF)
The next simplest case is where we take our embedding network, a convolutional network, and fix it after random initialization.
Because the network is fixed, the features are stable.
The features can be made compact in dimensionality, but they are not constrained to be.
However, random features may fail to be sufficient.
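A minimal PyTorch sketch of such a random-feature embedding, assuming 84x84 four-frame Atari-style observations; the architecture is illustrative, not the paper's exact network:

import torch.nn as nn

class RandomFeatures(nn.Module):
    def __init__(self, out_dim=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=8, stride=4), nn.LeakyReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.LeakyReLU(),
            nn.Conv2d(64, 64, kernel_size=3, stride=1), nn.LeakyReLU(),
            nn.Flatten(),
            nn.Linear(64 * 7 * 7, out_dim),  # 7x7 feature maps for 84x84 inputs
        )
        for p in self.parameters():
            p.requires_grad_(False)          # frozen at random init, so φ is stable

    def forward(self, obs):
        return self.net(obs)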

497 :YAMAGUTIseisei:2018/08/26(日) 16:36:04.77 ID:CL1hr8qnX ?2BP(3)
Variational Autoencoders (VAE) VAEs were introduced in [15, 31] to fit latent variable generative models p(x, z) for observed data x and latent variable z with prior p(z) using variational inference.
The method calls for an inference network q(z|x) that approximates the posterior p(z|x).
This is a feedforward network that takes an observation as input and outputs a mean and variance vector describing a Gaussian distribution with diagonal covariance.


             VAE   IDF    RF     Pixels
Stable       No    No     Yes    Yes
Compact      Yes   Yes    Maybe  No
Sufficient   Yes   Maybe  Maybe  Yes

Table 1:
Table summarizing the categorization of different kinds of feature spaces considered.


We can then use the mapping to the mean as our embedding network φ.
These features will be a low-dimensional approximately sufficient summary of the observation,
but they may still contain some irrelevant details such as noise, and the features will change over time as the VAE trains.
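A short sketch of using such an inference network as the embedding, with an assumed CNN backbone producing feat_dim features; φ(x) is taken to be the mean head (names are illustrative):

import torch.nn as nn

class VAEEncoder(nn.Module):
    def __init__(self, backbone, feat_dim, z_dim):
        super().__init__()
        self.backbone = backbone               # CNN mapping x to feat_dim features
        self.mu = nn.Linear(feat_dim, z_dim)   # mean head, used as φ(x)
        self.logvar = nn.Linear(feat_dim, z_dim)

    def forward(self, x):
        h = self.backbone(x)
        return self.mu(h), self.logvar(h)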

498 :YAMAGUTIseisei:2018/08/26(日) 16:37:31.56 ID:CL1hr8qnX ?2BP(3)
Inverse Dynamics Features (IDF) Given a transition (s_t, s_{t+1}, a_t), the inverse dynamics task is to predict the action a_t given the previous and next states s_t and s_{t+1}.
Features are learned using a common neural network φ to first embed s_t and s_{t+1}.
The intuition is that the features learned should correspond to aspects of the environment that are under the agent's immediate control.
This feature learning method is easy to implement and in principle should be invariant to certain kinds of noise (see [27] for a discussion).
A potential downside could be that the features learned may not be sufficient, that is, they do not represent important aspects of the environment that the agent cannot immediately affect.
A summary of these characteristics is provided in Table 1.
Note that the learned features are not stable because their distribution changes as learning progresses.
One way to achieve stability could be to pre-train VAE or IDF networks.
However, unless one has access to the internal state of the game, it is not possible to get representative data of the game scenes to train the features.
One way is to act randomly to collect data, but then it will be biased to where the agent started, and won't generalize further.
Since all the features involve some trade-off of desirable properties, it becomes an empirical question as to how effective each of them is across environments.
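A minimal PyTorch sketch of the IDF objective, assuming a discrete action space and a shared embedding network; names and sizes are illustrative, not the paper's exact model:

import torch
import torch.nn as nn
import torch.nn.functional as F

class InverseDynamics(nn.Module):
    def __init__(self, embed, feat_dim, n_actions):
        super().__init__()
        self.embed = embed                     # shared φ applied to s_t and s_{t+1}
        self.head = nn.Sequential(
            nn.Linear(2 * feat_dim, 256), nn.ReLU(),
            nn.Linear(256, n_actions),
        )

    def loss(self, s_t, s_t1, a_t):
        # Classify the action taken between consecutive embedded states.
        z = torch.cat([self.embed(s_t), self.embed(s_t1)], dim=1)
        return F.cross_entropy(self.head(z), a_t)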

2.2 Practical considerations in training an agent driven purely by curiosity
Deciding upon a feature space is only the first part of the puzzle in implementing a practical system.
Here, we detail the critical choices we made in the learning algorithm.
Our goal was to reduce non-stationarity in order to make learning more stable and consistent across environments.
Through the considerations outlined below, we are able to get exploration to work reliably for different feature learning methods and environments with minimal changes to the hyper-parameters.

499 :YAMAGUTIseisei:2018/08/26(日) 16:38:36.96 ID:CL1hr8qnX ?2BP(3)
• PPO.
In general, we have found the PPO algorithm [38] to be robust and to require little hyper-parameter tuning; hence, we stick to it for our experiments.
• Reward normalization.
Since the reward function is non-stationary, it is useful to normalize the scale of the rewards so that the value function can learn quickly.
We did this by dividing the rewards by a running estimate of the standard deviation of the sum of discounted rewards (see the sketch after this list).
• Advantage normalization.
While training with PPO, we normalize the advantages [46] in a batch to have a mean of 0 and a standard deviation of 1.
• Observation normalization.
We run a random agent on our target environment for 10000 steps, then calculate the mean and standard deviation of the observation and use these to normalize the observations when training.
This is useful to ensure that the features do not have very small variance at initialization and to have less variation across different environments.
• More actors.
The stability of the method is greatly increased by increasing the number of parallel actors (which affects the batch-size) used.
We typically use 128 parallel runs of the same environment for data collection while training an agent.
• Normalizing the features.
In combining intrinsic and extrinsic rewards, we found it useful to ensure that the scale of the intrinsic reward was consistent across state space.
We achieved this by using batch-normalization [13] in the feature embedding network.
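A minimal sketch of the reward-normalization step referenced in the list above; the running-moment bookkeeping here is an illustrative variant, not the authors' exact code:

import numpy as np

class RewardNormalizer:
    def __init__(self, gamma=0.99):
        self.gamma = gamma
        self.ret = 0.0                         # running discounted return
        self.count, self.mean, self.m2 = 1e-4, 0.0, 0.0

    def normalize(self, r):
        self.ret = self.gamma * self.ret + r
        self.count += 1                        # Welford-style variance update
        d = self.ret - self.mean
        self.mean += d / self.count
        self.m2 += d * (self.ret - self.mean)
        std = np.sqrt(self.m2 / self.count)
        return r / (std + 1e-8)                # scale reward by return std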

500 :YAMAGUTIseisei:2018/08/26(日) 16:41:37.25 ID:CL1hr8qnX ?2BP(3)
2.3 'Death is not the end': discounted curiosity with infinite horizon
One important point is that the use of an end-of-episode signal, sometimes called a 'done', can often leak information about the true reward function.
If we don't remove the 'done' signal, many of the Atari games become too simple.
For example, a simple strategy of giving +1 artificial reward at every time-step when the agent is alive and 0 on death is sufficient to obtain a high score in some games,
for instance, the Atari game 'Breakout', where it will seek to maximize the episode length and hence its score.
In the case of negative rewards, the agent will try to end the episode as quickly as possible.
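This implies an infinite-horizon return computation in which the backup always bootstraps through episode boundaries rather than masking with 'done'; a minimal illustrative fragment (not the authors' code):

def infinite_horizon_returns(rewards, last_value, gamma=0.99):
    # Note: no (1 - done) mask anywhere; death is just another transition.
    ret, out = last_value, []
    for r in reversed(rewards):
        ret = r + gamma * ret
        out.append(ret)
    return out[::-1]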


501 :YAMAGUTIseisei:2018/08/26(日) 16:42:21.10 ID:CL1hr8qnX ?2BP(3)


[Figure 2 plot: extrinsic reward per episode vs. frames (millions) for BeamRider, BreakOut, MontezumaRevenge, Pong, Mario, Qbert, Riverraid, Seaquest, and SpaceInvaders; curves compare Pixels, VAE features, Inverse Dynamics features, and Random CNN features.]

Figure 2:
A comparison of feature learning methods on 8 selected Atari games and the Super Mario Bros.

502 :YAMAGUTIseisei:2018/08/26(日) 16:43:13.06 ID:CL1hr8qnX ?2BP(3)
These evaluation curves show the mean reward (with standard error) of agents trained purely by curiosity, without reward or an end-of-episode signal.
We see that our purely curiosity-driven agent is able to gather rewards in these environments without using any extrinsic reward at training.
Results on all of the Atari games are in the appendix in Figure 8.
We find that the curiosity model trained on pixels does not work well across any environment, and VAE features perform either the same as or worse than random and inverse-dynamics features.
Further, inverse dynamics-trained features perform better than random features in 55% of the Atari games.
An interesting outcome of this analysis is that random features for modeling curiosity are a simple, yet surprisingly strong baseline and likely to work well in half of the Atari games.


In light of this, if we want to study the behavior of pure exploration agents, we should not bias the agent.
In the infinite horizon setting (i.e., the discounted returns are not truncated at the end of the episode and always bootstrapped using the value function), death is just another transition to the agent, to be avoided only if it is boring.
Therefore, we removed 'done' to separate the gains of an agent's exploration from merely that of the death signal.
In practice, we do find that the agent avoids dying in the games since that brings it back to the beginning of the game, an area it has already seen many times and where it can predict the dynamics well.
This subtlety has been neglected by previous works showing experiments without extrinsic rewards.

503 :YAMAGUTIseisei:2018/08/26(日) 16:47:17.14 ID:CL1hr8qnX ?2BP(3)
3 Experiments

In all of our experiments, both the policy and the embedding network work directly from pixels.
For our implementation details including hyper-parameters and architectures, please refer to the Appendix A.
Unless stated otherwise, all curves are the average of three runs with different seeds, and the shaded areas are standard errors of the mean.
We have released the code and videos of a purely curious agent playing across all environments on the website^2.

3.1 Curiosity-driven learning without extrinsic rewards
We begin by scaling up pure curiosity-driven learning to a large number of environments without using any extrinsic rewards.
We pick a total of 54 diverse simulated environments, as shown in Figure 1,
including 48 Atari games, Super Mario Bros., 2 Roboschool scenarios (learning Ant controller and Juggling), Two-player Pong, 2 Unity mazes (with and without a TV controlled by the agent).
The goal of this large-scale analysis is to investigate the following questions:
(a) What actually happens when you run a pure curiosity-driven agent on a variety of games without any extrinsic rewards?
(b) What kinds of behaviors can you expect from these agents? (c) What is the effect of the different feature learning variants in dynamics-based curiosity on these behaviors?

^2 http://pathak22.github.io/large-scale-curiosity/


504 :YAMAGUTIseisei:2018/08/26(日) 16:48:28.26 ID:CL1hr8qnX ?2BP(3)

A) Atari Games
To answer these questions, we began with a collection of well-known Atari games and ran a suite of experiments with different feature learning methods.
One way to measure how well a purely curious agent performs is to measure the extrinsic reward it is able to achieve, i.e. how good is the agent at playing the game.
We show the evaluation curves of mean extrinsic reward on 8 common Atari games in Figure 2 and on the full 48-game Atari suite in Figure 8 in the appendix.
It is important to note that the extrinsic reward is only used for evaluation, not for training.
However, this is just a proxy for pure exploration because the game rewards could be arbitrary and might not align at all with how the agent explores out of curiosity.

The first thing to notice from the curves is: most of them are going up.
This shows that a pure curiosity-driven agent can learn to obtain external rewards even without using any extrinsic rewards during training.
It is remarkable that agents with no extrinsic reward and no end of episode signal can learn to get scores comparable in some cases to learning with the extrinsic reward.
For instance, in Breakout, the game score increases on hitting the ball with the paddle into bricks which disappear and give points when struck.
The more times the bricks are struck in a row by the ball, the more complicated the pattern of bricks remaining becomes, making the agent more curious to explore further, hence collecting points as a by-product.
Further, when the agent runs out of lives, the bricks are reset to a uniform structure again that has been seen by the agent many times before and is hence very predictable, so the agent tries to stay alive to be curious by avoiding reset by death.

505 :YAMAGUTIseisei:2018/08/26(日) 16:54:32.44 ID:CL1hr8qnX ?2BP(3)
This is an unexpected result and might suggest that many popular RL test-beds do not need an external reward.
This may be because game designers (similar to architects, urban planners, gardeners, etc.) are very good at setting up curriculums to guide agents through a task, which explains why a curiosity-like objective aligns reasonably well with the extrinsic reward in many human-designed environments [6, 12, 16, 48].
However, this is not always the case, and sometimes a curious agent can even do worse than a random agent!
This happens when the extrinsic reward has little correlation with the agent's exploration, or when the agent fails to explore efficiently (e.g. see the games 'Atlantis' and 'IceHockey' in Figure 8).
We further encourage the reader to refer to the game-play videos of the agent available on the website for a better understanding of the learned skills.

Comparison of feature learning methods:
We compare four feature learning methods in Figure 2: raw pixels, random features, inverse dynamics features and VAE features.
Training dynamics on raw pixels performs poorly across all the environments, while encoding pixels into features does better.
This is likely because it is hard to learn a good dynamics model in pixel space, and prediction errors may be dominated by small irrelevant details.

Surprisingly, random features (RF) perform quite well across tasks and sometimes better than using learned features.
One reason for the good performance is that the random features are kept frozen (stable), so the dynamics model learned on top of them has an easier time because of the stationarity of the target.
In general, random features should work well in the domains where visual observations are simple enough, and random features can preserve enough information about the raw signal, for instance, Atari games.
Interestingly, we find that while random features work well at training, IDF learned features appear to generalize better in Mario Bros. (see Section 3.2 for details).

506 :YAMAGUTIseisei:2018/08/26(日) 16:55:34.83 ID:CL1hr8qnX ?2BP(3)
The VAE method also performed well but was somewhat unstable, so we decided to use RF and IDF for further experiments.
The detailed results in appendix Figure 8 compare IDF vs. RF across the full Atari suite.
To quantify the learned behaviors, we compared our curious agents to a randomly acting agent.
We found that an IDF-curious agent collects more game reward than a random agent in 75% of the Atari games, an RF-curious agent does better in 70%.
Further, IDF does better than RF in 55% of the games.
Overall, random features and inverse dynamics features worked well in general.
Further details in the appendix.

B) Super Mario Bros.
We compare different feature learning methods in Mario Bros. in Figure 2.
Super Mario Bros has already been studied in the context of extrinsic reward free learning [27] in small-scale experiments, and so we were keen to see how far curiosity alone can push the agent.
We use a faster, more efficient version of the Mario simulator to scale up to longer training, keeping the observation space, actions, and dynamics of the game intact.
Due to 100x longer training and using PPO for optimization, our agent is able to pass several levels of the game, significantly improving over prior exploration results on Mario Bros.
Could we further push the performance of a purely curious agent by making the underlying optimization more stable? One way is to scale up the batch-size.
We do so by increasing the number of parallel threads for running environments from 128 to 2048.


507 :YAMAGUTIseisei:2018/08/26(日) 16:58:13.95 ID:CL1hr8qnX ?2BP(3)


[Figure 3 plots: (a) Mario with large batch, extrinsic reward per episode vs. number of gradient updates, for batches of 128 vs. 1024 environments; (b) Juggling (Roboschool) and (c) Two-player Pong, pure-curiosity (no-reward, infinite-horizon) exploration vs. frames (millions).]

Figure 3:
(a) Left: A comparison of the RF method on Mario with different batch sizes.
Results are without using extrinsic reward.
(b) Center: Number of ball bounces in the Juggling (Roboschool) environment.
(c) Right: Mean episode length in the multiplayer Pong environment.
The discontinuous jump on the graph corresponds to the agent reaching a limit of the environment:
after a certain number of steps in the environment the Atari Pong emulator starts randomly cycling through background colors and becomes unresponsive to the agent's actions.

508 :YAMAGUTIseisei:2018/08/26(日) 16:59:09.88 ID:CL1hr8qnX ?2BP(3)
We show the comparison between training using 128 and 2048 parallel environment threads in Figure 3(a).
As apparent from the graph, training with large batch-size using 2048 parallel environment threads performs much better.
In fact, the agent is able to explore much more of the game: discovering 11 different levels of the game, finding secret rooms and defeating bosses.
Note that the x-axis in the figure is the number of gradient steps, not the number of frames, since the point of this large-scale experiment is not a claim about sample-efficiency, but performance with respect to training the agent.
This result suggests that the performance of a purely curiosity-driven agent would improve as the training of base RL algorithm (which is PPO in our case) gets better.
The video is on the website.

C) Roboschool Juggling
We modified the Pong environment from the Roboschool framework to only have one paddle and to have two balls.
The action space is continuous with two-dimensions, and we discretized the action space into 5 bins per dimension giving a total of 25 actions.
Both the policy and embedding network are trained on pixel observation space (note: not state space).
This environment is more difficult to control than the toy physics used in games, but the agent learns to intercept and strike the balls when they come into its area.
We monitored the number of bounces of the balls as a proxy for interaction with the environment, as shown in Figure 3(b).
See the video on the project website.

509 :YAMAGUTIseisei:2018/08/26(日) 16:59:48.41 ID:CL1hr8qnX ?2BP(3)
D) Roboschool Ant Robot
We also explored using the Ant environment which consists of an Ant with 8 controllable joints on a track.
We again discretized the action space and trained policy and embedding network on raw pixels (not state space).
However, in this case, it was harder to measure exploration because the extrinsic distance reward measures progress along the racetrack, but a purely curious agent is free to move in any direction.
We find that a walking-like behavior emerges purely out of curiosity-driven training.
We refer the reader to the result video showing that the agent is meaningfully interacting with the environment.

E) Multi-agent curiosity in Two-player Pong
We have already seen that a purely curiosity-driven agent learns to play several Atari games without reward, but we wonder how much of that behavior is caused by the fact that the opposing player is a computer-agent with hardcoded strategy.
What would happen if we were to make both the teams playing against each other curious? To find out, we take the Two-player Pong game where both sides (the paddles) are controlled by curiosity-driven agents.
We share the initial layers of both agents and use different action heads, i.e., the total action space is now the cross product of the actions of player 1 by the actions of player 2.

Note that the extrinsic reward is meaningless in this context since the agent is playing both sides, so instead, we show the length of the episode.
The results are shown in Figure 3(c).
We see from the episode length that the agent learns to have more and longer rallies over time, learning to play pong without any teacher, purely by curiosity on both sides.
In fact, the game rallies eventually get so long that they break our Atari emulator causing the colors to change radically, which crashes the policy as shown in the plot.


510 :YAMAGUTIseisei:2018/08/26(日) 17:01:15.34 ID:CL1hr8qnX ?2BP(3)

3.2 Generalization across novel levels in Super Mario Bros.
In the previous section, we showed that our purely curious agent can learn to explore efficiently and learn useful skills, e.g., game-playing behaviour in games, walking behaviour in Ant, etc.
So far, these skills were shown only in the environment the agent was trained on.
However, one advantage of developing reward-free learning is that one should then be able to utilize abundant "unlabeled" environments without reward functions by showing generalization to novel environments.

To test this, we first pre-train our agent using curiosity only in the Level 1-1 of Mario Bros.
We investigate how well RF and IDF-based curiosity agents generalize to novel levels of Mario.
In Figure 4, we show two examples of training on one level of Mario and finetuning on another testing level, and compare to learning on the testing level from scratch.
The training signal in all the cases is only curiosity reward.
In the first case, from Level 1-1 to Level 1-2, the global statistics of the environments match (both are 'day' environments in the game, i.e., blue background) but the levels have different enemies, geometry and difficulty level.
We see that there is strong transfer for both methods in this scenario.
However, the transfer performance is weaker in the second scenario from Level 1-1 to Level 1-3.
This is so because the problem is considerably harder for the latter level pairing as there is a color scheme shift from day to night, as shown in Figure 4.

511 :YAMAGUTIseisei:2018/08/26(日) 17:02:53.25 ID:CL1hr8qnX ?2BP(3)
We further note that IDF-learned features transfer in both the cases and random features transfer in the first case, but do not transfer in the second scenario from day to night.
These results might suggest that while random features perform well on training environments, learned features appear to generalize better to novel levels.
However, this needs more analysis in the future across a large variety of environments.
Overall, we find some promising evidence showing that skills learned by curiosity help our agent explore efficiently in novel environments.


[Figure 4 plots: extrinsic reward per episode vs. frames (millions) for 'World 1 level 1 to world 2 level 1' and 'World 1 level 1 to world 3 level 1'; curves compare IDF scratch, IDF transfer, RF scratch, and RF transfer.]

Figure 4:
Mario generalization experiments.
On the left we show transfer results from Level 1-1 to Level 1-2, and on the right we show transfer results from Level 1-1 to Level 1-3.
Underneath each plot is a map of the source and target environments.
All agents are trained without extrinsic reward.

512 :YAMAGUTIseisei:2018/08/26(日) 17:04:02.97 ID:CL1hr8qnX ?2BP(3)
[Figure 5 plot: mean extrinsic reward vs. frames (millions) in the Unity maze; curves compare Random CNN features, extrinsic only, and inverse dynamics features.]

Figure 5: Mean extrinsic reward in the Unity environment while training with terminal extrinsic + curiosity reward.
Note that the curve for extrinsic reward only training is constantly zero.


3.3 Curiosity with Sparse External Reward
In all our experiments so far, we have shown that our agents can learn useful skills without any extrinsic rewards driven purely by curiosity.
However, in many scenarios, we might want the agent to perform some particular task of interest.
This is usually conveyed to the agent by defining extrinsic rewards.
When rewards are dense (e.g. game score at every frame), classic RL works well and intrinsic rewards generally should not help performance.
However, designing dense rewards is a challenging engineering problem (see introduction for details).
In this section, we evaluate how well curiosity can help an agent perform a task in presence of sparse, or just terminal, rewards.

Terminal reward setting:
For many real problems, e.g. navigation, only a terminal reward is available, a setting where classic RL typically performs poorly.
Hence, we consider the 3D navigation in a maze designed in the Unity ML-agent framework with 9 rooms and a sparse terminal reward.


513 :YAMAGUTIseisei:2018/08/26(日) 17:04:57.59 ID:CL1hr8qnX ?2BP(3)

There is a discrete action space consisting of: move forwards, look left 15 degrees, look right 15 degrees and no-op.
The agent starts in room-1, which is furthest away from room-9, which contains the goal of the agent.
We compare an agent trained with extrinsic reward (+1 when the goal is reached, 0 otherwise) to an agent trained with extrinsic + intrinsic reward.
The extrinsic-only (classic RL) agent never finds the goal in any of our trials, which means it never obtains a meaningful gradient signal,
whereas extrinsic+intrinsic typically converges to getting the reward every time.
Figure 5 shows results for vanilla PPO, PPO + IDF-curiosity, and PPO + RF-curiosity.

Sparse reward setting: In preliminary experiments, we picked 5 Atari games which have sparse rewards (as categorized by [3]), and compared extrinsic (classic RL) vs. extrinsic+intrinsic (ours) reward performance.
In 4 games out of 5, curiosity bonus improves performance (see Table 2 in the appendix, the higher score is better).
We would like to emphasize that this is not the focus of the paper, and these experiments are provided just for completeness.
We just combined extrinsic (coefficient 1.0) and intrinsic reward (coefficient 0.01) directly without any tuning.
We leave the question of how to optimally combine extrinsic and intrinsic rewards as a future direction.
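The combination used is thus a fixed linear mix, as in this one-line sketch:

def combined_reward(r_ext, r_int, ext_coef=1.0, int_coef=0.01):
    # Untuned direct combination used for the sparse-reward experiments.
    return ext_coef * r_ext + int_coef * r_int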

514 :YAMAGUTIseisei:2018/08/26(日) 17:11:36.23 ID:CL1hr8qnX ?2BP(3)
4 Related Work

Intrinsic Motivation:
A family of approaches to intrinsic motivation reward an agent based on prediction error [2, 27, 36, 42], prediction uncertainty [11, 44], or improvement [19, 34] of a forward dynamics model of the environment that gets trained along with the agent's policy.
As a result the agent is driven to reach regions of the environment that are difficult to predict for the forward dynamics model, while the model improves its predictions in these regions.
These adversarial and non-stationary dynamics can give rise to complex behaviors.
Relatively little work has been done in this area on the pure exploration setting where there is no external reward.
Of these, most closely related are those that use a forward dynamics model of a feature space, such as Stadie et al. [42] where they use autoencoder features, and Pathak et al. [27] where they use features trained with an inverse dynamics task.
These correspond roughly to the VAE and IDF methods detailed in Section 2.1.

Smoothed versions of state visitation counts can be used for intrinsic rewards [3, 9, 24, 47].
Count-based methods have already shown very strong results when combined with extrinsic rewards, such as setting the state of the art in the Atari game Montezuma's Revenge [3],
and also showing significant exploration of the game without using the extrinsic reward.
It is not yet clear in which situations count-based approaches should be preferred over dynamics-based approaches; we chose to focus on dynamics-based bonuses in this paper since we found them straightforward to scale and parallelize.
In our preliminary experiments, we did not have sufficient success with already existing count-based implementations in scaling up for a large-scale study.

515 :YAMAGUTIseisei:2018/08/26(日) 17:16:26.46 ID:CL1hr8qnX ?2BP(3)
Learning without extrinsic rewards or fitness functions has also been studied extensively in evolutionary computing, where it is referred to as 'novelty search' [17, 18, 43].
There the novelty of an event is often defined as the distance of the event to the nearest neighbor amongst previous events, using some statistics of the event to compute distances.
One interesting finding from this literature is that often much more interesting solutions can be found by not solely optimizing for fitness.

Other methods of exploration are designed to work in combination with maximizing a reward function, such as those utilizing uncertainty about value function estimates [5, 23], or those using perturbations of the policy for exploration [8, 29].
Schmidhuber [37] and Oudeyer [25], Oudeyer and Kaplan [26] provide a great review of some of the earlier work on approaches to intrinsic motivation.
Alternative methods of exploration include Sukhbaatar et al. [45] where they utilize an adversarial game between two agents for exploration.
In Gregor et al. [10], they optimize a quantity called empowerment which is a measurement of the control an agent has over the state.
In concurrent work, diversity is used as a measure to learn skills without reward functions (Eysenbach et al. [7]).

Random Features:
One of the findings in this paper is the surprising effectiveness of random features, and there is a substantial literature on random projections and more generally randomly initialized neural networks.
Much of the literature has focused on using random features for classification [14, 33, 49] where the typical finding is that whilst random features can work well for simpler problems,
feature learning performs much better once the problem becomes sufficiently complex.
Whilst we expect this pattern to also hold true for dynamics-based exploration, we have some preliminary evidence showing that learned features appear to generalize better to novel levels in Mario Bros.


516 :YAMAGUTIseisei:2018/08/26(日) 17:18:41.35 ID:CL1hr8qnX ?2BP(3)

5 Discussion

We have shown that our agents trained purely with a curiosity reward are able to learn useful behaviours: (a) agents able to play many Atari games without using any rewards;
(b) Mario crossing over 11 levels of the game without reward;
(c) walking-like behavior emerging in the Ant environment;
(d) juggling-like behavior in the Roboschool environment; (e) rally-making behavior in Two-player Pong with curiosity-driven agents on both sides.
But this is not always true, as there are some Atari games where exploring the environment does not correspond to extrinsic reward.

More generally, these results suggest that, in environments designed by humans, the extrinsic reward is perhaps often aligned with the objective of seeking novelty.
The game designers set up curriculums to guide users while playing the game, which explains why a curiosity-like objective aligns reasonably well with the extrinsic reward in many human-designed games [6, 12, 16, 48].



[Figure 6 plot: extrinsic reward per episode vs. frames (millions); curves compare RF with TV off, RF with TV on, IDF with TV off, and IDF with TV on.]

Figure 6:
We add a noisy TV to the unity environment in Section 3.3.
We compare IDF and RF with and without the TV.

517 :YAMAGUTIseisei:2018/08/26(日) 17:19:53.47 ID:CL1hr8qnX ?2BP(3)
Limitation of prediction-error-based curiosity:
A more serious potential limitation is the handling of stochastic dynamics.
If the transitions in the environment are random, then even with a perfect dynamics model, the expected reward will be the entropy of the transition, and the agent will seek out transitions with the highest entropy.
Even if the environment is not truly random, unpredictability caused by a poor learning algorithm, an impoverished model class or partial observability can lead to exactly the same problem.
We did not observe this effect in our experiments on games so we designed an environment to illustrate the point.

We return to the maze of Section 3.3 to empirically validate a common thought experiment called the noisy-TV problem.
The idea is that local sources of entropy in an environment like a TV that randomly changes channels when an action is taken should prove to be an irresistible attraction to our agent.
We take this thought experiment literally and add a TV to the maze along with an action to change the channel.
In Figure 6 we show how adding the noisy-TV affects the performance of IDF and RF.
As expected the presence of the TV drastically slows down learning, but we note that if you run the experiment for long enough the agents do sometimes converge to getting the extrinsic reward consistently.
We have shown empirically that stochasticity can be a problem, and so it is important for future work to address this issue in an efficient manner.

518 :YAMAGUTIseisei:2018/08/26(日) 17:22:09.20 ID:CL1hr8qnX ?2BP(3)
Future Work:
We have presented a simple and scalable approach that can learn nontrivial behaviors across a diverse range of environments without any reward function or end-of-episode signal.
One surprising finding of this paper is that random features perform quite well, but learned features appear to generalize better.
Whilst we believe that learning features will become important once the environment is complex enough, we leave that up to future work to explore.
Our wider goal, however, is to show that we can take advantage of many unlabeled (i.e., not having an engineered reward function) environments to improve performance on a task of interest.
Given this goal, showing performance in environments with a generic reward function is just the first step, and future work could investigate transfer from unlabeled to labeled environments.

Acknowledgments

We would like to thank Chris Lu for helping with the Unity environment, Phillip Isola and Alex Nichols for feedback on the paper.
We are grateful to the members of BAIR and OpenAI for fruitful discussions.
DP is supported by the Facebook graduate fellowship.

References

[1] Unity ML-agents. http://github.com/Unity-Technologies/ml-agents. 2





[2] J. Achiam and S. Sastry. Surprise-based intrinsic motivation for deep reinforcement learning.
arXiv:1703.01732, 2017. 3, 9
[3] M. Bellemare, S. Srinivasan, G. Ostrovski, T. Schaul, D. Saxton, and R. Munos.
Unifying count-based exploration and intrinsic motivation. In NIPS, 2016. 1, 9
[4] M. G. Bellemare, Y. Naddaf, J. Veness, and M. Bowling.
The arcade learning environment: An evaluation platform for general agents. Journal of Artificial Intelligence Research, 47:253-279, June 2013. 2

519 :YAMAGUTIseisei:2018/08/26(日) 17:23:49.66 ID:CL1hr8qnX ?2BP(3)
[5] R. Y. Chen, J. Schulman, P. Abbeel, and S. Sidor.
UCB and infogain exploration via q-ensembles. arXiv:1706.01502, 2017. 9
[6] G. Costikyan. Uncertainty in games. Mit Press, 2013. 6, 10
[7] B. Eysenbach, A. Gupta, J. Ibarz, and S. Levine.
Diversity is all you need: Learning skills without a reward function. arXiv preprint, 2018. 9
[8] M. Fortunato, M. G. Azar, B. Piot, J. Menick, I. Osband, A. Graves, V. Mnih, R. Munos, D. Hassabis, O. Pietquin, C. Blundell, and S. Legg.
Noisy networks for exploration. arXiv:1706.10295, 2017. 9
[9] J. Fu, J. D. Co-Reyes, and S. Levine.
EX2: Exploration with exemplar models for deep reinforcement learning. NIPS, 2017. 9
[10] K. Gregor, D. J. Rezende, and D. Wierstra.
Variational intrinsic control. ICLR Workshop, 2017. 9
[11] R. Houthooft, X. Chen, Y. Duan, J. Schulman, F. De Turck, and P. Abbeel.
Vime: Variational information maximizing exploration. In NIPS, 2016. 1, 9
[12] R. Hunicke, M. LeBlanc, and R. Zubek.
MDA: A formal approach to game design and game research. In AAAI Workshop on Challenges in Game AI, 2004. 6, 10
[13] S. Ioffe and C. Szegedy.
Batch normalization: Accelerating deep network training by reducing internal covariate shift. arXiv preprint arXiv:1502.03167, 2015. 4
[14] K. Jarrett, K. Kavukcuoglu, Y. LeCun, et al.
What is the best multi-stage architecture for object recognition?
In Computer Vision, 2009 IEEE 12th International Conference on, pages 2146-2153. IEEE, 2009. 9
[15] D. P. Kingma and M. Welling.
Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114, 2013. 2, 3
[16] N. Lazzaro. Why we play games:
Four keys to more emotion in player experiences. In Proceedings of GDC, 2004. 6, 10
[17] J. Lehman and K. O. Stanley.
Exploiting open-endedness to solve problems through the search for novelty. In ALIFE, 2008. 9
[18] J. Lehman and K. O. Stanley.
Abandoning objectives: Evolution through the search for novelty alone. Evolutionary computation, 2011. 9

520 :YAMAGUTIseisei:2018/08/26(日) 17:25:07.68 ID:CL1hr8qnX ?2BP(3)
[19] M. Lopes, T. Lang, M. Toussaint, and P.-Y. Oudeyer.
Exploration in model-based reinforcement learning by empirically estimating learning progress. In NIPS, 2012. 9
[20] M. Lopes, T. Lang, M. Toussaint, and P.-Y. Oudeyer.
Exploration in model-based reinforcement learning by empirically estimating learning progress. In NIPS, 2012. 1
[21] V. Mnih, K. Kavukcuoglu, D. Silver, A. A. Rusu, J. Veness, M. G. Bellemare, A. Graves, M. Riedmiller, A. K. Fidjeland, G. Ostrovski, et al.
Human-level control through deep reinforcement learning. Nature, 2015. 1
[22] S. Mohamed and D. J. Rezende.
Variational information maximisation for intrinsically motivated reinforcement learning. In NIPS, 2015. 1
[23] I. Osband, C. Blundell, A. Pritzel, and B. Van Roy.
Deep exploration via bootstrapped dqn. In NIPS, 2016. 9
[24] G. Ostrovski, M. G. Bellemare, A. v. d. Oord, and R. Munos.
Count-based exploration with neural density models. arXiv:1703.01310, 2017. 1, 9
[25] P.-Y. Oudeyer.
Computational theories of curiosity-driven learning. arXiv preprint arXiv:1802.10546, 2018. 9

11


Page 12

[26] P.-Y. Oudeyer and F. Kaplan.
What is intrinsic motivation? a typology of computational approaches. Frontiers in neurorobotics, 2009. 1, 9
[27] D. Pathak, P. Agrawal, A. A. Efros, and T. Darrell.
Curiosity-driven exploration by self-supervised prediction. In ICML, 2017. 1, 2, 3, 4, 6, 9
[28] D. Pathak, P. Mahmoudieh, G. Luo, P. Agrawal, D. Chen, Y. Shentu, E. Shelhamer, J. Malik, A. A. Efros, and T. Darrell.
Zero-shot visual imitation. In ICLR, 2018. 1
[29] M. Plappert, R. Houthooft, P. Dhariwal, S. Sidor, R. Y. Chen, X. Chen, T. Asfour, P. Abbeel, and M. Andrychowicz.
Parameter space noise for exploration. arXiv:1706.01905, 2017. 9
[30] P. Poupart, N. Vlassis, J. Hoey, and K. Regan.
An analytic solution to discrete bayesian reinforcement learning. In ICML, 2006. 1

521 :YAMAGUTIseisei:2018/08/26(日) 17:27:09.49 ID:CL1hr8qnX ?2BP(3)
[31] D. J. Rezende, S. Mohamed, and D. Wierstra.
Stochastic backpropagation and approximate inference in deep generative models. arXiv preprint arXiv:1401.4082, 2014. 3
[32] E. L. Ryan, Richard; Deci.
Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemporary Educational Psychology, 2000. 1
[33] A. M. Saxe, P. W. Koh, Z. Chen, M. Bhand, B. Suresh, and A. Y. Ng.
On random weights and unsupervised feature learning. In ICML, pages 10891096, 2011. 9
[34] J. Schmidhuber.
Curious model-building control systems. In Neural Networks, 1991. 1991 IEEE International Joint Conference on, pages 14581463. IEEE, 1991. 9
[35] J. Schmidhuber.
A possibility for implementing curiosity and boredom in model-building neural controllers.
In From animals to animats: Proceedings of the first international conference on simulation of adaptive behavior, 1991. 1
[36] J. Schmidhuber.
A possibility for implementing curiosity and boredom in model-building neural controllers, 1991. 9
[37] J. Schmidhuber.
Formal theory of creativity, fun, and intrinsic motivation (19902010). IEEE Transactions on Autonomous Mental Development, 2010. 9
[38] J. Schulman, F. Wolski, P. Dhariwal, A. Radford, and O. Klimov.
Proximal policy optimization algorithms. arXiv preprint arXiv:1707.06347, 2017. 4
[39] J. Schulman, F. Wolski, P. Dhariwal, A. Radford, and O. Klimov.
Proximal policy optimization algorithms. arXiv preprint arXiv:1707.06347, 2017. 2
[40] S. P. Singh, A. G. Barto, and N. Chentanez.
Intrinsically motivated reinforcement learning. In NIPS, 2005. 1
[41] L. Smith and M. Gasser.
The development of embodied cognition: Six lessons from babies. Artificial life, 2005. 1
[42] B. C. Stadie, S. Levine, and P. Abbeel.
Incentivizing exploration in reinforcement learning with deep predictive models. NIPS Workshop, 2015. 2, 9
[43] K. O. Stanley and J. Lehman.
Why greatness cannot be planned: The myth of the objective. Springer, 2015. 9

522 :YAMAGUTIseisei:2018/08/26(日) 17:28:31.63 ID:CL1hr8qnX ?2BP(3)
[44] S. Still and D. Precup.
An information-theoretic approach to curiosity-driven reinforcement learning. Theory in Biosciences, 2012. 9
[45] S. Sukhbaatar, I. Kostrikov, A. Szlam, and R. Fergus.
Intrinsic motivation and automatic curricula via asymmetric self-play. In ICLR, 2018. 9
[46] R. S. Sutton and A. G. Barto.
Reinforcement learning: An introduction. MIT press Cambridge, 1998. 4
[47] H. Tang, R. Houthooft, D. Foote, A. Stooke, X. Chen, Y. Duan, J. Schulman, F. De Turck, and P. Abbeel.
#Exploration: A study of count-based exploration for deep reinforcement learning. Advances in Neural Information Processing Systems, 2017. 9
[48] P. Wouters, H. Van Oostendorp, R. Boonekamp, and E. Van der Spek.
The role of game discourse analysis and curiosity in creating engaging and effective serious games by implementing a back story and foreshadowing. Interacting with Computers, 2011. 6, 10
[49] Z. Yang, M. Moczulski, M. Denil, N. de Freitas, A. Smola, L. Song, and Z. Wang.
Deep fried convnets. In Proceedings of the IEEE International Conference on Computer Vision, pages 14761483, 2015. 9

12


Page 13

A Implementation Details

We have released the training code and environments on our website (footnote 3).
For full details, we refer the reader to the code and the video results on the website.

Pre-processing:
All experiments were done with pixels.
We converted all images to grayscale and resized them to 84x84.
We learn the agent's policy and forward dynamics function on a stack of historical observations [x_{t-3}, x_{t-2}, x_{t-1}, x_t] instead of only the current observation.
This is to capture partial observability in these games.
In the case of Super Mario Bros and Atari experiments, we also used a standard frameskip wrapper that repeats each action 4 times.
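
As a concrete illustration of this pre-processing, the frame-stack buffer can be sketched as follows (a minimal sketch in C with names of our own choosing; the actual released training code is not in C and differs in detail):

  #include <string.h>

  /* Sketch of the observation stack described above: 84x84 grayscale
     frames, the model input being the last four observations
     [x_{t-3}, x_{t-2}, x_{t-1}, x_t].  All names here are illustrative. */
  enum { FRAME_H = 84, FRAME_W = 84, STACK_N = 4 };

  typedef struct {
    unsigned char frames[STACK_N][FRAME_H * FRAME_W]; /* oldest first */
  } ObsStack;

  /* Shift the history by one frame and append the newest observation. */
  static void obs_stack_push(ObsStack *s, const unsigned char *gray84x84)
  {
    memmove(s->frames[0], s->frames[1],
            (size_t)(STACK_N - 1) * FRAME_H * FRAME_W);
    memcpy(s->frames[STACK_N - 1], gray84x84, (size_t)FRAME_H * FRAME_W);
  }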

523 :YAMAGUTIseisei:2018/08/26(日) 17:29:10.00 ID:CL1hr8qnX ?2BP(3)
Architectures:
Our embedding network and policy networks had identical architectures and were based on the standard convolutional networks used in Atari experiments.
The layer we take as features in the embedding network had dimension 512 in all experiments and no nonlinearity.
To keep the scale of the prediction error consistent relative to extrinsic reward, in the Unity experiments we applied batchnorm to the embedding network.
We also did this for the Mario generalization experiments to reduce covariate shift from level to level.
For the VAE auxiliary task and the pixel method, we used a similar deconvolutional architecture, the exact details of which can be found in our code submission.
The IDF and forward dynamics networks were heads on top of the embedding network with several extra fully-connected layers of dimensionality 512.

Hyper-parameters:
We used a learning rate of 0.0001 for all networks.
In most experiments, we used 128 parallel environments with the exceptions of the Unity and Roboschool experiments where we could only run 32 parallel environments, and the large scale Mario experiment where we used 2048.
We used rollouts of length 128 in all experiments except for the Unity experiments where we used 512 length rollouts so that the network could quickly latch onto the sparse reward.
In the initial 9 experiments on Mario and Atari, we used 3 optimization epochs per rollout in the interest of speed.
In the Mario scaling, generalization experiments, as well as the Roboschool experiments, we used 6 epochs.
In the Unity experiments, we used 8 epochs, again to more quickly take advantage of sparse rewards.

B Additional Results

524 :YAMAGUTIseisei:2018/08/26(日) 17:29:43.70 ID:CL1hr8qnX ?2BP(3)
[Figure 7: (a) Left: Best extrinsic returns on eight Atari games and Mario. (b) Right: Mean episode lengths on eight Atari games and Mario. Panels: BeamRider, BreakOut, MontezumaRevenge, Pong, Mario, Qbert, Riverraid, Seaquest, SpaceInvaders; curves: Pixels, VAE features, Inverse Dynamics features, Random CNN features; axes: frames (millions) vs. extrinsic reward per episode.]

525 :YAMAGUTIseisei:2018/08/26(日) 17:32:07.16 ID:CL1hr8qnX ?2BP(3)
[Axes: frames (millions) vs. extrinsic reward per episode; curves: Inverse Dynamics features, Random CNN features, and a random agent baseline.]

Figure 8:
Pure curiosity-driven exploration (no extrinsic reward, or end-of-episode signal) on 48 Atari games.
We observe that the extrinsic returns of curiosity-driven agents often increase despite the agents having no access to the extrinsic return or the end-of-episode signal.
In multiple environments, the performance of the curiosity-driven agents is significantly better than that of a random agent, although there are environments where the behavior of the agent is close to random, or in fact seems to minimize the return rather than maximize it.
For the majority of the training process, RF performs better than a random agent in about 67% of the environments, while IDF performs better than a random agent in about 71% of the environments.


Reward     Gravitar        Freeway      Venture     PrivateEye      MontezumaRevenge
Ext Only    999.3 ± 220.7  33.3 ± 0.6     0 ± 0     5020.3 ± 395    1783 ± 691.7
Ext + Int  1165.1 ± 53.6   32.8 ± 0.3   416 ± 416   3036.5 ± 952.1  2504.6 ± 4.6

Table 2:
These results compare the mean reward (± std-error) after 100 million frames across 3 seeds for an agent trained with intrinsic plus extrinsic reward versus extrinsic reward only.
The extrinsic (coefficient 1.0) and intrinsic reward (coefficient 0.01) were directly combined without any hyper-parameter tuning.
We leave the question of how to optimally combine extrinsic and intrinsic rewards to future work.
This is to emphasize that combining extrinsic with intrinsic rewards is not the focus of the paper, and these experiments are provided just for completeness.

526 :YAMAGUTIseisei:2018/08/26(日) 17:32:51.06 ID:CL1hr8qnX ?2BP(3)
B.1 Atari
To better measure the amount of exploration, we provide the best return of curiosity-driven agents in figure 7(a) and the episode lengths in figure 7(b).
Notably on Pong the increasing episode length combined with a plateau in returns shows that the agent maximizes the number of ball bounces, rather than the reward.

Figure 8 shows the performance of curiosity-driven agents based on Inverse Dynamics and Random features on 48 Atari games.

Although not the focus of this paper, for completeness we include some results on combining intrinsic and extrinsic reward on several sparse reward Atari games.
When combining with extrinsic rewards, we use the end of the episode signal.
The reward used is the extrinsic reward plus 0.01 times the intrinsic reward.
The results are shown in Table 2.
We don't observe a large difference between the settings, likely because the combination of intrinsic and extrinsic reward needs to be tuned.
We did observe that one of the intrinsic+extrinsic runs on Montezuma's Revenge explored 10 rooms.

3 Website at http://pathak22.github.io/large-scale-curiosity/


527 :YAMAGUTIseisei:2018/08/26(日) 17:33:32.21 ID:CL1hr8qnX ?2BP(3)


[Plot "Scale in Mario": extrinsic reward per episode vs. number of gradient updates, for a batch of 128 environments and a batch of 1024 environments.]

Figure 9:
Best extrinsic returns on the Mario scaling experiments.
We observe that larger batches allow the agent to explore more effectively, reaching the same performance in fewer parameter updates and also achieving better ultimate scores.


B.2 Mario

We show the analogue of the plot shown in Figure 3(a), giving max extrinsic returns; see Figure 9.

528 :YAMAGUTIseisei:2018/08/26(日) 17:36:22.93 ID:CL1hr8qnX ?2BP(3)
>>488-

>>514
A family of approaches to intrinsic motivation reward an agent based on prediction error, prediction uncertainty, or improvement of a forward dynamics model of the environment that gets trained along with the agent's policy.

>>515
literature has focused on using random features for classification, where the typical finding is that whilst random features can work well for simpler problems, feature learning performs much better once the problem becomes sufficiently complex.

529 :YAMAGUTIseisei:2018/09/09(日) 07:47:01.01 ID:vMpjqLBja ?2BP(3)
http://mazonka.com/st/lcss.pdf





A Simple Multi-Processor Computer Based on Subleq


Oleg Mazonka and Alex Kolodin
mazonka@gmail.com alex.kolodin@gmail.com

May 2011 (Revision 3, draft 8)



Abstract

Subleq (Subtract and Branch on result Less than or Equal to zero) is both an instruction set and a programming language for a One Instruction Set Computer (OISC).
We describe a hardware implementation of an array of 28 one-instruction Subleq processors on a low-cost FPGA board.
Our test results demonstrate that the computational power of our Subleq OISC multi-processor is comparable to that of a CPU of a modern personal computer.
Additionally, we provide implementation details of our compiler from a C-style language to Subleq.

530 :YAMAGUTIseisei:2018/09/09(日) 07:48:08.65 ID:vMpjqLBja ?2BP(3)
Contents

1. Introduction
2. Subleq Assembly Language
3. Hardware design
 3.1 Overview
 3.2 Interface description
 3.3 Subleq processor
4. C Compiler for Subleq
 4.1 Stack
 4.2 Expressions
 4.3 Function calls
 4.4 Stack variables
 4.5 Multiplication
 4.6 Conditional jump
5. Results
 5.1 Test #1
 5.2 Test #2
6. Conclusion
7. Appendix
 7.1 C with multiplication
 7.2 C without multiplication
 7.3 Subleq code
References



531 :YAMAGUTIseisei:2018/09/09(日) 07:51:37.91 ID:vMpjqLBja ?2BP(3)
1. Introduction

OISC (One Instruction Set Computer) is the ultimate RISC (Reduced Instruction Set Computer), in which the instruction set of a conventional CPU is reduced to a single instruction.
Having only one available processor instruction eliminates the need for an op-code and permits simpler computational elements, thus allowing more of them to be implemented in hardware with the same number of logic gates.
Since our goal was to build a functional multi-processor system with the maximum possible number of processors on a single low-cost programmable chip,
OISC was a natural choice, with the remaining step being the selection of a suitable single-processor instruction set.

Currently known OISC can be roughly separated into three broad categories:

1. Transport Triggered Architecture Machines;
2. Bit Manipulating Machines;
3. Arithmetic Based Turing-Complete Machines.

Transport Triggered Architecture (TTA) is a design in which computation is a side effect of data transport.
Usually some memory registers (triggering ports) within a common address space perform an assigned operation when the instruction references them.
For example, in an OISC utilizing a single memory-to-memory copy instruction [1], this is done by triggering ports performing arithmetic and instruction pointer jumps when writing into them.
Despite appealing simplicity, there are two major drawbacks associated with such OISC.
Firstly, the CPU has to have separate functional units controlling triggering ports.
Secondly, there are difficulties with generalization of the design, since any two different hardware designs are likely to use two different assembly languages.
Because of these disadvantages we ruled out this class of OISCs for our implementation.

532 :YAMAGUTIseisei:2018/09/09(日) 07:52:35.82 ID:vMpjqLBja ?2BP(3)
Bit Manipulating Machines is the simplest class.
A bit copying machine, called BitBitJump, copies one bit in memory and passes the execution unconditionally to the address specified by one of the operands of the instruction [2].
This process turns out to be capable of universal computation (i.e.
being able to execute any algorithm and to interpret any other universal machine) because copying bits can conditionally modify the code ahead to be executed.
Another machine, called the Toga computer, inverts a bit and passes the execution conditionally depending on the result of inversion [3].
Yet another bit operating machine, similar to BitBitJump, copies several bits at the same time.
The problem of computational universality is solved in this case by keeping predefined jump tables in the memory [4].
Despite simplicity of the bit operating machines, we ruled them out too, because they require more memory than is normally available on an inexpensive FPGA.
To make a functional multiprocessor machine with bit manipulation operation, at least 1Mb of memory per processor is required.
Therefore we decided that a more complex processor with less memory is a better choice for our purposes.




Arithmetic based Turing-complete Machines use an arithmetic operation and a conditional jump.
Unlike the two previous classes which are universal computers, this class is universal and Turing-complete in its abstract representation.
The instruction operates on integers which may also be addresses in memory.
Currently there are several known OISCs of this class, based on different arithmetic operations [5]:
addition - Addleq, decrement - DJN, increment - P1eq, and subtraction - Subleq (Subtract and Branch on result Less than or Equal to zero).
The latter is the oldest, the most popular and, arguably, the most efficient [6][7].
A Subleq instruction consists of three operands: two for subtraction and one for the conditional jump.

533 :YAMAGUTIseisei:2018/09/09(日) 07:53:50.33 ID:vMpjqLBja ?2BP(3)
Attempts to build hardware around Subleq have been undertaken previously.
For example, David A Roberts designed a Subleq CPU and wrote a software Subleq library [8].
His implementation is a single CPU with keyboard input, terminal, control ROM, and 16Mb RAM, and is much more complex than ours.
There were a few other similar designs described on various Internet sites, e.g. [9].
However all of them were just proof-of-concept simulations without a practical implementation.

In the following sections we describe components of the system we built.
Section 2 outlines a Subleq abstract machine and its assembly notation.
Section 3 describes our hardware implementation of the multiprocessor core.
Section 4 briefly describes techniques used to convert a high level programming language into Subleq instruction code.
In Sections 5 and 6 we present comparative speed-test results for our device, followed by a discussion and summary.
In the Appendix the code calculating factorials is presented in C and Subleq notations.


Subleq software <- USB -> Driver, Programming environment

Figure 1 FPGA board is connected to PC by USB cable


Figure 1 represents the connection of our device to a computer with USB cable.

534 :YAMAGUTIseisei:2018/09/09(日) 07:55:07.61 ID:vMpjqLBja ?2BP(3)
2. Subleq Assembly Language

A Subleq abstract machine operates on an infinite array of memory, where each cell holds an integer number.
This number can be an address of another memory cell.
The numbering starts from zero.
The program is defined as a sequence of instructions read from the memory with the first instruction at the address zero.
A Subleq instruction has 3 operands:

A B C

Execution of one instruction A B C subtracts the value in the memory cell at the address stored in A from the content of a memory cell at the address stored in B and then writes the result back into the cell with the address in B.
If the value after subtraction in B is less or equal to zero, the execution jumps to the address specified in C; otherwise execution continues to the next instruction, i.e. the address of the memory cell following C.
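
In C-like terms, one execution step of this rule can be sketched as follows (our illustration; halting and input/output conventions are introduced below):

  /* One Subleq step: memory is an array of integers, ip the instruction
     pointer.  Returns the address of the next instruction to execute. */
  int subleq_step(int *memory, int ip)
  {
    int a = memory[ip];
    int b = memory[ip + 1];
    int c = memory[ip + 2];

    memory[b] -= memory[a];           /* memory[B] = memory[B] - memory[A] */
    return (memory[b] <= 0) ? c       /* jump if the result is <= 0        */
                            : ip + 3; /* otherwise fall through            */
  }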




Assembly notation helps to read and write code in Subleq.
The following is the list of syntax conventions:

- label;
- question mark;
- reduced instruction;
- multi-instruction;
- literal and expression;
- data section;
- comment.

535 :YAMAGUTIseisei:2018/09/09(日) 07:56:10.49 ID:vMpjqLBja ?2BP(3)
Label is a symbolic name of a particular address followed by a colon.
In the following example

 A B C
A:2 B:1 0
C:B B 0

each line represents one instruction with three operands.
Here A, B, and C are not abstract names, but labels (addresses) of specific cells in the memory.
For example, label A refers to the 4th cell, which is initialised with value 2.
The first instruction subtracts the value of cell A from the value of cell B, whose value is 1, and stores the result in cell B, which becomes -1.
Since the result is less than zero, the next instruction to be executed is the third line, because value C is the address of the first operand of the instruction on the third line.
That subtracts B from B making it zero, so the execution is passed to the address 0.
If these three lines are the whole program, then the first operand of the first instruction has the address 0.
In this case the execution is passed back to the first instruction which would make B -2.
That process continues forever.
The instructions being executed are only the first and the third lines, and the value of the cell B changes as 1, -1, 0, -2, 0, -2, 0, and so on.

A question mark is defined as the address of the next cell in memory:

A B ?
B B 0

is the same as

A B C
C:B B 0

Reduced instruction format is a convenient shortcut: two operands instead of three assume the third to be the address of next instruction, i.e. ?; and only one operand assumes the second to be the same as the first, so

536 :YAMAGUTIseisei:2018/09/09(日) 07:57:06.05 ID:vMpjqLBja ?2BP(3)
A

is the same as

A A




and is the same as

A A ?

If more than one instruction is placed on the same line, each instruction except the last must be followed by semicolon.
The following code copies the value from A to B:

Z; B; A Z; Z B

Integer numbers like A:72 are used as constants in the code.
Literals can be used instead of integers assuming their ASCII values.
For example, A:'H' is the same as A:72; and A:"Hi" is the same as A:'H' 'i'.
Addition, subtraction, parenthesis, and unary minus can be used in expression.
The code Z Z ?+3 A B C D E F sets Z to zero and jumps to the third instruction D E F.
Since instructions can be reduced, the assembler must know when to generate a full instruction with three operands.
To avoid such generation, a period at the beginning of the line is used.
Thus program data can be placed on such a line.
The code

537 :YAMAGUTIseisei:2018/09/09(日) 07:58:40.41 ID:vMpjqLBja ?2BP(3)
A A ?+1
. U:-1
U A

sets A to 1.
Without the period on the second line the code would be interpreted as

A A ?+1
U:-1 (-1) ?
U A

Comments are delimited by hash symbol #: everything from # till the end of the line is a comment.
A jump to a negative address halts the program.
Usually (-1) as the third operand is used to stop the program, for example:

# halt
Z Z (-1)

The parentheses around (-1) are necessary to indicate that it is the third operand, so the instruction would not be interpreted as

Z Z-1 ?

To make a Subleq program interactive (requesting data and responding to user while working), input and output operations can be defined as operations on a non-existing memory cell.
The same (-1) address can be used for this.
If the second operand is (-1) the value of the first operand is the output.
If the first operand is (-1), the second operand gets the value from the input stream.
Input and output operations are defined on byte basis in ASCII code.
If the program tries to output a value greater than 255, the behaviour is undefined.

Below is a "Hello world" program adapted from Lawrence Woodman helloworld.sq [10].

538 :YAMAGUTIseisei:2018/09/09(日) 07:59:47.32 ID:vMpjqLBja ?2BP(3)


It is exceptionally terse, but is a good example of Subleq efficiency.

L:H (-1); U L; U ?+2; Z H (-1); Z Z L
. U:-1 H:"hello, world\n" Z:0

A special variable called Z is often used in Subleq as an intermediate temporary variable within a very small scope.
It is commonly assumed that this variable is initialised at zero and left at zero after every usage.

The program above consists of five instructions.
The first instruction prints the character pointed to by its first operand (the first pointer), which is initialised to the beginning of the data string - the letter 'h'.
The second instruction increments that pointer - the first operand of the first instruction.
The third instruction increments the second pointer, which is the second operand of the fourth instruction.
The fourth instruction tests the value pointed to by the second pointer and halts the program when the value is zero.
It becomes zero when the pointer reaches the cell one after the end of the data string, which is Z:0.
The fifth instruction loops back to the beginning of the program, so the process continues until the halt condition is satisfied.
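
The conventions above (halt on a negative jump address, output when the second operand is -1, input when the first operand is -1) are enough to run such programs; below is a small emulator sketch in C, our illustration, assuming the program has already been assembled into an integer array:

  #include <stdio.h>

  /* Minimal Subleq emulator sketch: m is the assembled program,
     ip the start address (normally 0). */
  void subleq_run(int *m, int ip)
  {
    while (ip >= 0) {                 /* a negative address halts      */
      int a = m[ip], b = m[ip + 1], c = m[ip + 2];
      if (b == -1) {                  /* output the value A points to  */
        putchar(m[a]);
        ip += 3;
      } else if (a == -1) {           /* input a byte into the cell B  */
        m[b] = getchar();
        ip += 3;
      } else {
        m[b] -= m[a];
        ip = (m[b] <= 0) ? c : ip + 3;
      }
    }
  }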

539 :YAMAGUTIseisei:2018/09/09(日) 08:00:42.28 ID:vMpjqLBja ?2BP(3)
3. Hardware design

3.1 Overview

We have used Altera Cyclone III EP3C16 FPGA as the basis for a hardware implementation.
The choice was based on the relatively low price (~ US$30) of this FPGA IC chip and availability of the test hardware for it.

The test board we used has a DDR2 RAM IC fitted, but access to the RAM is limited to one process at a time.
True parallel implementation requires separate internal memory blocks allocated for each processor hence the amount of available memory in the FPGA limits the number of processors.
The EP3C16 FPGA has 56 16-bit-wide memory blocks of 8 Kbits each.
Our implementation of one 32-bit Subleq processor requires a minimum of 2 memory blocks, so only 28 processors can fit into the FPGA.
We could have chosen a 16-bit implementation and had more processors (up to 56), but with only 1 Kbyte of memory allocated to each.

The FPGA is connected to the USB bus with the help of an external Cypress FX2 CPU that is configured as a bridge between USB and SPI (Serial Peripheral Interface) utilised to load FPGA with code and data.
The interface bridge is transparent for the PC software.


    FPGA
MEMORY  <-> PROCESSOR1
MEMORY  <-> PROCESSOR2
MEMORY  <-> PROCESSOR3
  :       <-> SPI <-> CONTROL_CPU <-> USB
MEMORY  <-> PROCESSOR7
MEMORY  <-> PROCESSOR8

Figure 2 Block-diagram of the board



540 :YAMAGUTIseisei:2018/09/09(日) 08:01:42.82 ID:vMpjqLBja ?2BP(3)
Figure 2 is a communication block diagram of the board.
The solution was coded in VHDL and compiled with Quartus II Web Edition software freely available from the Altera website.
Our code is scalable for up to 63 processors for possible use with larger FPGAs.
The limit of 63 processors is due to our SPI bus addressing implementation and can be increased, if necessary.

All 28 processors run independently and are synchronised by a single 150 MHz clock generated by one of the FPGA PLLs from a reference oscillator fitted on the PCB.

To increase the number of processors, additional boards with FPGAs could be easily connected via USB bus.

3.2 Interface description

Each processor has a serial interface to its allocated memory and a Status byte, accessible from a single address via serial loading.
The serial interface takes over memory's data and address buses when processing is stopped.

Address space inside the FPGA is organised as a table addressed by two numbers: processor index and memory address.
Reading one byte from index 0 returns the number of processors inside the FPGA.
For this design the returned value is 28.
Indices from 1 to 28 are assigned to the processors, which have 2048 bytes (512 of 32 bit words) of memory available to each.

Writing to a processor memory is an operation which sequentially loads a buffer of 2048 bytes.
Reading from the processor’s memory is different: the first word (4 bytes) returned is the status of the processor and the rest is the memory content.

541 :YAMAGUTIseisei:2018/09/09(日) 08:02:33.56 ID:vMpjqLBja ?2BP(3)
The Status byte - the first byte of the first word - can be in one of three states: 0xA1 (running), 0xA2 (stopped), or 0xA0 (stopped and not run since power on).
Writing into the processor's memory automatically starts execution, thus eliminating the need for a separate run command.
Reading from a processor's memory stops that processor.
An exception is reading the first byte, the status, which does not stop the processor.
Additionally, a processor can be stopped by Subleq halt operand (-1) as mentioned in Section 2.
Other negative references, such as input or output as described above in the Subleq assembly language section, also stop the processor, because no IO operations are defined in this architecture.

3.3 Subleq processor

The state machine algorithm can be presented in pseudocode as:

  IP = 0
  while (IP >= 0)
  {
    A = memory[IP]
    B = memory[IP+1]
    C = memory[IP+2]
    if (A < 0 or B < 0)
    {
      IP = -1
    }
    else
    {
      memory[B] = memory[B] - memory[A]
      if (memory[B] > 0)
        IP = IP + 3
      else
        IP = C
    }
  }

542 :YAMAGUTIseisei:2018/09/09(日) 08:03:12.40 ID:vMpjqLBja ?2BP(3)


where IP is an instruction pointer, memory[] is a value of a memory cell, and A, B, and C are integers.

The Subleq processor core is written with the help of the RAM 2-Port megafunction of the Quartus II software, which we used to build dual-port memory access.
The implemented solution allows access to the contents at two distinct addresses (memory[A] and memory[B]) simultaneously, which saves processing clock ticks.
The disadvantage of this implementation is an additional latency of one clock tick to access the data and address buses, compared to a single-port memory implementation.
However, the total number of processing clock ticks per memory access for the dual port is still less than that required for a single port.

The core is based on a state machine, which starts automatically when the memory of a processor is loaded.
On any read or write operation, or encountering a negative operand the processing stops, but the first status byte can be read at any time without affecting the computation.

4. C Compiler for Subleq

In this section we briefly describe some elements of the compiler we wrote, which compiles simplified C code into Subleq [11].
The compiler is used in one of our tests, so that direct comparison is possible between execution of a compiled native C code and a Subleq code, compiled from the same C source.
The compiler is a high-level language interface to OISC - the only such compiler known to us at the time of writing.

4.1 Stack

The primary C programming language concepts are functions and the stack.
Implementation of the stack in Subleq is achievable by using memory below the code.
Using code self-modification, one can place values into and retrieve them from the stack.
Function calls require return address to be placed into the stack.
Consider the following C code:

543 :YAMAGUTIseisei:2018/09/09(日) 08:06:52.07 ID:vMpjqLBja ?2BP(3)
  void f()
  {
    ...
  }

  void main()
  {
    ...
    f();
    ...
  }




After the above is compiled to machine code, it must perform the following operations:
1) the address of the instruction immediately after the call to f has to be put on the stack;
2) a jump to the code of the function f must be made;
3) at the end of the function f, the address from the stack needs to be extracted; and
4) the execution should be transferred to the extracted address.
According to the C standard, the function main is a proper C function, i.e. it can be called from other functions, including itself.
Hence the program must have a separate entry point, which in the following code is called sqmain.
The C code above compiles into:

544 :YAMAGUTIseisei:2018/09/09(日) 08:08:07.99 ID:vMpjqLBja ?2BP(3)
    0 0 sqmain
  _f:
    ...
    #return ?+8; sp ?+4; ?+7; 0 ?+3; Z Z 0
  _main:
    ...
    #call f
    dec sp; ?+11; sp ?+7; ?+6; sp ?+2; 0
    ?+6; sp ?+2; ?+2 0 _f; . ?; inc sp
    ...
    #return
    ?+8; sp ?+4; ?+7; 0 ?+3; Z Z 0
  sqmain:
    #call main
    dec sp; ?+11; sp ?+7; ?+6; sp ?+2; 0
    ?+6; sp ?+2; ?+2 0 _main; . ?; inc sp
    0 0 (-1)
  . inc:-1 Z:0 dec:1 sp:-sp

The stack pointer cell sp is the last memory cell in the program.
It is initialised with the negative value of its own address.
A negative value is used here to speed up code execution - working with the subtraction operation may sometimes save a few steps if the data is recorded as the negative of its actual value.
The instruction dec sp subtracts 1 from sp, hence increasing its real value by 1.
Below is an excerpt calling the function f in a more readable form: relative references (?) are replaced with labels.

545 :YAMAGUTIseisei:2018/09/09(日) 08:13:11.96 ID:vMpjqLBja ?2BP(3)
  dec sp
  A; sp A
  B; sp B
  A:0 B:0
  C; sp C
  D C:0 _f
  . D:?
  inc sp

The instruction on the fourth line is to clear the cell in the stack, since some values can be left there from previous use.
However, clearing the top cell of the stack is not a single-step task, because one has to clear the operands of the instruction itself and then initialise them with the value of the sp pointer.
Thus the executed command sequence is the following:
allocate a new cell in the stack by increasing the stack pointer (first line);
clear the first operand of the instruction;
initialise this operand with the new value of the stack pointer (second line);
do the same with the second operand of the instruction - clear and initialise (third line); and then execute the instruction, which will clear the allocated cell in the stack (fourth line).

The next two instructions clear and initialise cell C similarly.
The instruction D C:0 _f copies the address of the instruction inc sp to the stack and jumps to _f.
This works, because D holds the value of the next memory cell (remember ?) and C points to the now cleared top cell on the stack.




The value just written to the stack is negative (return addresses are stored negated), so the result of the subtraction is less than or equal to zero and execution jumps to the label _f.

Once inside the function f, the stack pointer can be modified, but we assume that functions restore it before they exit.
So the return code has to jump to the address extracted from the stack:

546 :YAMAGUTIseisei:2018/09/09(日) 08:13:40.63 ID:vMpjqLBja ?2BP(3)
  A; sp A
  B; A:0 B
  Z Z B:0

Here the value of the stack pointer sp is written to A, and the instruction A:0 B copies the stored address to B.
The address is stored negatively, so the positive value is restored by the subtraction.

The stack does a little bit more than just storing return addresses.
That will be explored later in subsections 4.3 and 4.4.

4.2 Expressions

C language operations consist of statements, which in turn consist of keyword statements and expressions.
The syntax of keyword statements and expressions is best represented by Backus-Naur Forms (BNF) - the standard way of representing context-free grammars.
A classic example is the grammar of arithmetic expressions:

  expression :=
    term
    expression + term
    expression - term
  term :=
    primary
    term * primary
    term / primary
  primary :=
    identifier
    constant
    ( expression )

These mutually recursive definitions can be used by a program called a parser to build a tree representation of any grammatically valid expression.
Once such a tree is built, the compiler's job is to organise the sequence of instructions so that the result of any sub-tree is passed up the tree.
For example, a tree of the expression:

547 :YAMAGUTIseisei:2018/09/09(日) 08:14:35.49 ID:vMpjqLBja ?2BP(3)
  a + ( b - c )


[Figure: tree of the expression a + ( b - c )]


consists of a node ‘+’, variable a and a sub-tree, which consists of a node ‘-’, and variables b and c.
To make the calculation, the compiler must use a temporary variable to store the result of the sub-tree, which has to be used later in the addition, and potentially further up if this expression is part of a larger expression.
In this particular example we need only one temporary, but generally many temporaries are required.
The expression is compiled into the following code:

  t; b Z; Z t; Z
  c t
  a Z; Z t; Z




The first line copies value b into temporary t.
The second line subtracts value c from the temporary.
At this point the compiler is finished with the sub-tree.
Its result is the generated code and the temporary variable t holding the value of the calculated sub-tree.
Now the compiler generates code for the addition.
Its arguments now are variable a and temporary t.
The third line adds a to t.
Now t holds the result of the whole expression.
If this expression is a part of a larger expression, t is passed up the tree as an argument to the upper level node of the tree.
If not, then the value of t is discarded, because the evaluation is finished.
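
The tree walk just described can be sketched in C as below. This is our illustration, not the actual compiler: for simplicity it materialises every leaf into a temporary, whereas the hand-written code above subtracts the variable c directly.

  #include <stdio.h>

  typedef struct Node {
    char op;               /* '+', '-', or 0 for a leaf variable */
    const char *name;      /* leaf only: variable name           */
    struct Node *lhs, *rhs;
  } Node;

  static int next_temp = 1;

  /* Emits Subleq text for the sub-tree; returns the index of the
     temporary holding its value, to be passed up the tree. */
  static int emit(const Node *n)
  {
    if (n->op == 0) {      /* leaf: copy the variable into a temporary */
      int t = next_temp++;
      printf("t%d; %s Z; Z t%d; Z\n", t, n->name, t);
      return t;
    }
    int lt = emit(n->lhs);
    int rt = emit(n->rhs);
    if (n->op == '-')
      printf("t%d t%d\n", rt, lt);            /* t_lt -= t_rt */
    else
      printf("t%d Z; Z t%d; Z\n", rt, lt);    /* t_lt += t_rt */
    return lt;
  }

  int main(void)
  {
    Node a = {0, "a", 0, 0}, b = {0, "b", 0, 0}, c = {0, "c", 0, 0};
    Node sub = {'-', 0, &b, &c};
    Node add = {'+', 0, &a, &sub};
    emit(&add);            /* prints code leaving a+(b-c) in t1 */
    return 0;
  }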

548 :YAMAGUTIseisei:2018/09/09(日) 08:15:29.08 ID:vMpjqLBja ?2BP(3)
More advanced grammar may involve assignment, dereferencing, unary operations and many others.
But each grammar construction can be represented by a corresponding sub-tree, and processed later by the compiler to produce code.
For example, a subtraction from a dereferenced value represented in C as:

  *k -= a

has to be translated into

  t; k Z; Z t; Z
  a t:0

Here a temporary variable has to be used inside the code for dereferencing.
The sequence of instructions is: clear t; copy value k into t; subtract a from the memory k points to.

A few elements of grammar processing have been touched upon here.
The C grammar takes several pages just to list its BNF.
However, larger and more complex grammars are resolved by the compiler in a similar manner.

4.3 Function calls

In the subsection 4.1 above it was shown how to push into and pop from the stack.
When a function takes arguments, they have to be pushed into the stack together with the return address.
The stack must be restored upon function return.
Consider a function taking two arguments:

  int f(int a, int b);
    ...
    f(a,b);

The call to a function f must be translated into something like this

549 :YAMAGUTIseisei:2018/09/09(日) 08:17:21.13 ID:vMpjqLBja ?2BP(3)
  # 1 push b
  # 2 push a
  # 3 push return_address
  # 4 goto f # return_address:
  # 5 sp -= 3

In C, arguments can be expressions, and a call to a function can be part of another expression (a sub-expression), i.e.
the compiler must properly handle more complicated cases like the following:




  int f(int a, int b)
  {
    ...
    return f;
  }
    ...
    int k;
    k=f;
    k(f(1,2),3); // call via variable - indirect call
    k = f(1,2)(3,4); // call by return value

Here for simplicity the C function type int(*)(int,int) is represented as int.
Subleq supports only one variable type.
Therefore, a more elaborate typing system does not introduce extra functionality into the language.

550 :YAMAGUTIseisei:2018/09/09(日) 08:18:23.81 ID:vMpjqLBja ?2BP(3)
Arguments pushed onto the stack can properly be calculated as sub-expressions (sub-trees).
In this sense, for the actual function call it is irrelevant whether program variables or temporaries are pushed onto the stack.

  # 1 push B
    # clearing the next cell in the stack [remember that sp is negative]
    # the line below is same as in C syntax: *(++sp)=0;
    dec sp; t1; sp t1; t2; sp t2; t1:0 t2:0
    # same as in C syntax: *sp+=B;
    t3; sp t3; b Z; Z t3:0; Z

  # 2 push A
    # the same with A
    dec sp; t4; sp t4; t5; sp t5; t4:0 t5:0
    t6; sp t6; a Z; Z t6:0; Z

  # 3 push return_address
    dec sp; t7; sp t7; t8; sp t8; t7:0 t8:0
    t9; sp t9; t10 t9:0 goto_address
    . t10: return_address

  # 4 goto f goto_address: Z Z f

  # 5 sp -= 3
    return_address: const(-3) sp

The notation const(-3) sp is shorthand for

  unique_name sp
  ...
  unique_name:-3

551 :YAMAGUTIseisei:2018/09/09(日) 08:21:26.97 ID:vMpjqLBja ?2BP(3)
The code above handles neither the return value nor indirect calls yet.
The return value can be stored in a special variable (register).
If the program uses the return value in a sub-expression, it must copy the value into a temporary immediately upon return.
Indirect calls can be achieved by dereferencing a temporary holding the address of the function.
This is straightforward, but requires more complex code.

The stack pointer can be modified inside a function when the function requests stack (local) variables.
For accessing local variables, a base pointer bp is usually used.
It is initialised on function entrance; it is used as a base reference for local variables - each local variable has an associated offset from the base pointer; and it is used to restore the stack pointer at the end of the function.
Functions can call other functions, which means that each function must save the base pointer upon entry and restore it upon exit.




  1. # push bp
  2. # sp -> bp
  3. # sp -= stack_size
  4. # ... function body
  5. # bp -> sp
  6. # pop bp
  7. # return

Or, in Subleq code:

552 :YAMAGUTIseisei:2018/09/09(日) 08:23:41.07 ID:vMpjqLBja ?2BP(3)
  dec sp; ?+11; sp ?+7; ?+6; sp ?+2; 0
  ?+6; sp ?+2; bp 0
  bp; sp bp
  stack_size sp

  # ... function body

  sp; bp sp
  ?+8; sp ?+4; bp; 0 bp; inc sp
  ?+8; sp ?+4; ?+7; 0 ?+3; Z Z 0

stack_size is a constant, which is calculated for every function during parsing.
It turns out that it is not enough to save just bp.
A function call can happen inside an expression.
In such a case all the temporaries of the expression have to be saved.
A new function will be using the same temporary memory cells for its own needs.
For the expression f()+g() the results of the calls may be stored in variables t1 and t2.
If function g changes t1, where the result of function f is stored, a problem would appear.

A solution is to make every function push all temporaries it is using onto the stack and to restore them upon exit.
Consider the following function:

  int g()
  {
    return k+1;
  }

It translates into:

553 :YAMAGUTIseisei:2018/09/09(日) 08:24:55.53 ID:vMpjqLBja ?2BP(3)
  _g:
    # save bp
    dec sp; ?+11; sp ?+7; ?+6; sp ?+2; 0
    ?+6; sp ?+2; bp 0
    bp; sp bp

    # push t1
    dec sp; ?+11; sp ?+7; ?+6; sp ?+2; 0
    ?+6; sp ?+2; t1 0
    # push t2
    dec sp; ?+11; sp ?+7; ?+6; sp ?+2; 0
    ?+6; sp ?+2; t2 0

    # calculate addition
    t1; t2
    _k t1
    dec t1
    t1 t2
    # set the return value [negative]
    ax; t2 ax

    # pop t2
    ?+8; sp ?+4; t2; 0 t2; inc sp
    # pop t1
    ?+8; sp ?+4; t1; 0 t1; inc sp

    # restore bp
    sp; bp sp
    ?+8; sp ?+4; bp; 0 bp; inc sp
    # exit
    ?+8; sp ?+4; ?+7; 0 ?+3; Z Z 0

554 :YAMAGUTIseisei:2018/09/09(日) 08:25:45.91 ID:vMpjqLBja ?2BP(3)


If somewhere inside the code there are calls to other functions, the temporaries t1 and t2 keep their calculated values, because other functions save and restore them when executed.
Since all the temporaries used in a function are pushed onto the stack, it pays off to reduce the number of temporaries used.
This can be done simply by releasing any used temporary into a pool of free temporaries.
Later, when a new temporary is requested, the pool is checked first, and a new temporary is allocated only when the pool is empty.
The expression

  1+k[1]

compiles into

  t1; t2; _k t1; dec t1; t1 t2
  t3; t4; ?+11; t2 Z; Z ?+4; Z; 0 t3; t3 t4;
  t5; t6; dec t5; t4 t5; t5 t6
  # result in t6

When the pool of temporaries is introduced, the number of temporaries is halved:

  t1; t2; _k t1; dec t1; t1 t2
  t1; t3; ?+11; t2 Z; Z ?+4; Z; 0 t1; t1 t3
  t1; t2; dec t1; t3 t1; t1 t2
  # result in t2

which dramatically reduces the code by removing the corresponding push and pop operations.
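
The pool itself is simple; a sketch in C (sizes and names are illustrative, not the actual compiler's):

  /* Temporaries are identified by index.  Released ones go into a free
     list and are reused before any new temporary is created. */
  enum { MAX_TEMPS = 64 };

  static int pool[MAX_TEMPS];    /* indices of released temporaries */
  static int pool_size = 0;
  static int temp_count = 0;     /* total temporaries ever created  */

  static int temp_alloc(void)
  {
    if (pool_size > 0)
      return pool[--pool_size];  /* reuse a released temporary      */
    return temp_count++;         /* pool empty: create a new one    */
  }

  static void temp_release(int t)
  {
    pool[pool_size++] = t;       /* return the temporary to the pool */
  }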

555 :YAMAGUTIseisei:2018/09/09(日) 08:26:33.41 ID:vMpjqLBja ?2BP(3)
4.4 Stack variables

Once bp is placed on the stack and sp is decremented to allocate memory, all local variables become available.
They can be accessed only indirectly because the compiler does not know their addresses.
For example, the function f in

  int f(int x, int y)
  {
    int a, b=3, c[3], d=5;
    ...
  }
  f(7,9);

has 4 local variables with the stack size equal to 6.
When this function is entered the stack has the following values:

... y[9] x[7] [return_address] [saved_bp] a[?] b[3] c0[?] c1[?] c2[?] d[5] ...
                               ^ (bp)                                 ^ (sp)

The compiler knows about the offset of each variable from bp.



556 :YAMAGUTIseisei:2018/09/09(日) 08:28:24.39 ID:vMpjqLBja ?2BP(3)
  Variable  Offset
  y         -3
  x         -2
  a          1
  b          2
  c          3
  d          6

Hence, in the code any reference to a local variable, not pointing to an array, can be replaced with *(bp+offset).
The array c has to be replaced with (bp+offset) because the name of array is the address of its first element.
The name does not refer to a variable, but the referencing with [] does.
In C

  c[i]

is the same as

  *(c+i)

which can be interpreted in our example as

  *((bp+3)+i)

4.5 Multiplication

The only trivial multiplication in Subleq is multiplication by 2,

  t=a+a: t; a Z; a Z; Z t; Z

To multiply 2 numbers one can use the formula

557 :YAMAGUTIseisei:2018/09/09(日) 08:29:27.49 ID:vMpjqLBja ?2BP(3)
  A*B = (2A)*(B/2) + A*(B%2)

This is a simple recursive formula, but it requires integer and modular division.
Division can be implemented as the following algorithm.
Given two numbers A and B, B is repeatedly doubled until the next doubling would make it greater than A.
At the same time as doubling B, we double another variable I, which has been initialised to 1.
When B becomes greater than A, I holds that part of the result of the division - the rest is to be calculated further, using A-B and the original B.
This can be done recursively, accumulating all the I's.
At the last step, when A<B, A is the modulus.
This algorithm can be implemented as a short recursive function in C.
Upon exit this function returns the integer division as its result and the division modulus in the argument j.

  int divMod(int a, int b, int *j)
  {
    if( a < b ) { *j=a; return 0; }

    int b1=b, i=1, bp, ip;

  next:
    bp = b1; ip = i;
    b1 *= 2; i *= 2;
    if( b1 > a )
      return ip + divMod(a-bp, b, j);
    goto next;
  }



558 :YAMAGUTIseisei:2018/09/09(日) 08:30:57.51 ID:vMpjqLBja ?2BP(3)
This function is not optimal.
A more efficient function can be achieved by replacing the recursion with an external loop.
Multiplication, integer division and modular division, requiring quite elaborate calculations, can be implemented as library functions.
That is, each multiplication a*b can be replaced with a call _mul(a,b), and the compiler may later add (if necessary) the implementation of that function.
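
For reference, such a library multiplication can be written in C from the pieces above. This is our sketch (the name mul is illustrative), combining the divMod function from this section with the formula A*B = (2A)*(B/2) + A*(B%2) rewritten as a loop:

  #include <stdio.h>

  /* divMod as in the listing above. */
  int divMod(int a, int b, int *j)
  {
    if( a < b ) { *j=a; return 0; }
    int b1=b, i=1, bp, ip;
  next:
    bp = b1; ip = i;
    b1 *= 2; i *= 2;
    if( b1 > a )
      return ip + divMod(a-bp, b, j);
    goto next;
  }

  int mul(int a, int b)
  {
    int r = 0, rem;
    while (b) {
      b = divMod(b, 2, &rem);  /* b /= 2, rem = old b % 2    */
      if (rem) r += a;         /* add A when the bit is set  */
      a += a;                  /* A *= 2                     */
    }
    return r;
  }

  int main(void)
  {
    printf("%d\n", mul(123, 45));  /* prints 5535 */
    return 0;
  }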

4.6 Conditional jump

In C, Boolean expressions which evaluate to zero are false and non-zero are true.
In Subleq this leads to longer code when handling Boolean expressions because every Boolean expression evaluates on the basis of equality or non-equality to zero.

A better approach is to treat less or equal to zero as false and a positive value as true.
Then the if-expression if(expr){<body>} becomes just one instruction:

  Z t next
  <body>
  next: ...

where t is the result of the expression expr.
However, to remain fully compatible with C (for example, if(x+1){...} - an implicit conversion to Boolean), all cases where an integer expression is used as a Boolean have to be detected.
Fortunately there are only a few such cases:

  if(expr)
  while(expr)
  for(...,expr,...)
  ! expr
  expr1 && expr2
  expr1 || expr2

The job can be done inside the parser, so the compiler would not have to care about Boolean or integer expression, and it can produce much simpler code.

In cases when a Boolean variable is used in expressions as integer, like in:

559 :YAMAGUTIseisei:2018/09/09(日) 08:31:39.95 ID:vMpjqLBja ?2BP(3)
  passing an argument f(a>0)
  returning from a function return(a>0);
  assignment x=(a>0);
  other arithmetic expression x=1+5*(a>0);

the variable must be converted to C-style, i.e. a negative result zeroed.
This can be done as simply as

  x Z ?+3; x; Z


[Figure 3: Diagram representing conditional jumps, dispatching on the three cases x > 0, x == 0, and x < 0.]




A terse check for a value being less than, equal to, or greater than zero is:

  Z x ?+3; Z Z G; x Z E; Z; L:

560 :YAMAGUTIseisei:2018/09/09(日) 08:32:36.33 ID:vMpjqLBja ?2BP(3)
where L, E, and G are the addresses execution is passed to when x is less than, equal to, or greater than zero, respectively.
Figure 3 shows the schema of the execution.
Note that x does not change and Z is zero on any exit!
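
For comparison, the same three-way dispatch in C (our illustration, with labels L, E, and G as in the text):

  void three_way(int x)
  {
    if (x < 0)       goto L;
    else if (x == 0) goto E;
    else             goto G;
  L:  /* x < 0  */ return;
  E:  /* x == 0 */ return;
  G:  /* x > 0  */ return;
  }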

5. Results



Figure 4 FPGA board, 28 Subleq processors with allocated 2 Kb per processor


Figure 4 shows our FPGA board powered via USB cable.
Sized about 5 x 7 centimetres, the board implements 28 Subleq processors with 2 Kb of memory allocated to each, running at a clock frequency of 150 MHz.

To test the efficiency of the board we chose two mathematical problems.
The first calculates the size of a function residue of an arithmetic group.
The second calculates modular double factorials.

5.1 Test #1

In the first test we selected a problem of finding the order of the function residue of the following process:

  x[i+1] = 2*x[i] mod M
  y[i+1] = 2*(x[i] + y[i]) mod M

where x and y are integers initialised to 1, mod is the modulo operation, and M is some value.
Starting from the point (x0=1, y0=1) the equations generate a sequence of pairs.
We chose this problem because its solution is difficult, with answers often much greater than M (but less than M^2).
The number M was selected such that the calculations could be completed in a few minutes.
When this sequence is sufficiently long, a new pair of generated numbers will eventually be the same as the pair previously generated in the sequence.
The task is to find how many steps have to be completed before the first occurrence of the result with the same value.
In our test the selected value of M was M=5039 and the number of iterations was calculated as 12693241.

561 :YAMAGUTIseisei:2018/09/09(日) 08:33:43.69 ID:vMpjqLBja ?2BP(3)


A C program to solve this problem can be written without the use of multiplication or division:

  int x=1, y=1, m=5039;
  int c=0, ctr=1, t;
  int x0=0, y0=0;

  int printf();
  int main()
  {

    while(1)
    {
      y += x; y += y; x += x;
      while( x>=m ) x-=m;
      while( y>=m ) y-=m;

      if( x==x0 && y==y0 ) break;

      if( ++c==ctr )
      {
        x0=x; y0=y;
        c=0; ctr+=ctr;
      }
    }
    printf("point: %d %d loop: %d of %d\n",x0,y0,c+1,ctr);
  }

562 :YAMAGUTIseisei:2018/09/09(日) 08:35:14.91 ID:vMpjqLBja ?2BP(3)
This program has been tested in the following cases:

  1. Compiled with our Subleq compiler and run on one of the processors on FPGA board;
  2. Compiled with our Subleq compiler and emulated on PC#1 (Intel Q9650 at 3GHz)
  3. Compiled with Microsoft C/C++ compiler (v16) with full optimisation and run on PC#1.
  4. Same as 2, but run on PC#2 (Pentium 4 at 1.7GHz)
  5. Same as 3 run on PC#2

The table below shows execution time in seconds for each test.

  1  Subleq on 1 processor FPGA   94.0
  2  Subleq on PC#1               46.0
  3  C on PC#1                     0.37
  4  Subleq on PC#2              216
  5  C on PC#2                     0.54

From these results we conclude that the speed of a single processor on the FPGA is of the same order of magnitude as the speed of the CPU of an ordinary PC emulating Subleq instructions.
Native code on a PC runs about a hundred times faster.

5.2 Test #2

The second test was the calculation of modular double factorials, namely

  (N!)! mod M = ∏_{n=1}^{N} ∏_{i=1}^{n} i  mod M



563 :YAMAGUTIseisei:2018/09/09(日) 08:36:23.18 ID:vMpjqLBja ?2BP(3)
In this test case we were able to use the full power of our multi-processor Subleq system, because the multiplications in the above equation could be calculated in parallel across all 28 processors.
For N=5029 and M=5039 the result is 95, and these numbers were used in the test.
The number M was the same as in Test #1, and the number N was selected to give a result (95) in the ASCII printable range.
The calculations were run in the following configurations:

  1. Hand-written Subleq code run on the FPGA board [Appendix 7.3]
  2. Subleq code emulated on a PC (same as PC#1 in the first test)
  3. Equivalent C code compiled with the same C compiler and run on the PC [Appendix 7.1]
  4. Same C code compiled with the Subleq compiler and emulated on the PC
  5. Equivalent C code without multiplication operations compiled with the C compiler and run on the PC [Appendix 7.2]
  6. Same C code as in 5 compiled with the Subleq compiler and emulated on the PC

The code we used was not 100% efficient, since the solution to the problem needs ~O(N log N) operations if utilising modular exponentiation, rather than ~O(N^2) as presented in the Appendix.
However this is not important when evaluating relative performance.
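
The O(N log N) estimate comes from rewriting the double product as ∏_{i=1}^{N} i^(N-i+1) mod M, so N modular exponentiations of O(log N) multiplications each suffice. A standard square-and-multiply sketch in C (ours, not the paper's code):

  /* Computes (b^e) mod m by square-and-multiply in O(log e) steps. */
  long long mod_pow(long long b, long long e, long long m)
  {
    long long r = 1;
    b %= m;
    while (e > 0) {
      if (e & 1) r = (r * b) % m;  /* multiply in the current bit */
      b = (b * b) % m;             /* square the base             */
      e >>= 1;
    }
    return r;
  }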

The results are presented in the table below.
The values are execution time in seconds.

  1  Subleq on FPGA, parallel on 28 processors          62.0
  2  Subleq on PC (emulation)                          865
  3  C with multiplication, executable run on PC         0.15
  4  C with multiplication, Subleq emulated on PC     12060
  5  C without multiplication, executable run on PC      7.8
  6  C without multiplication, Subleq emulated on PC   9795

The 28 FPGA processors easily outperform the emulation of the same Subleq code on a PC.
C code without multiplication compiled into Subleq and emulated runs faster than C code with multiplication, because the compiler's library multiplication function is not as efficient as the multiplication function written explicitly in this example.

564 :YAMAGUTIseisei:2018/09/09(日) 08:40:24.80 ID:vMpjqLBja ?2BP(3)
6. Conclusion

Using an inexpensive Cyclone III FPGA we have successfully built an OISC multi-processor device with processors running in parallel.
Each processor has its own memory, limited to 2 Kb.
Due to this limitation we were unable to build a multi-processor board with an even simpler individual processor instruction set, such as bit copying [2],
because in that case a minimum of ~1 Mb of memory per processor is required to run practically useful computational tasks.
The limited memory available in our device also did not permit us to run more advanced programs, such as emulators of other processors, or to use more complex computational algorithms,
because all the computational code has to fit inside the memory allocated to each processor.




The size of the memory available to each processor can be increased by choosing a larger and faster, albeit more expensive, FPGA such as a Stratix V.
A faster processing clock and a larger number of CPUs could then be implemented as well.
The VHDL code of the CPU state machine could also be optimised, improving computational speed.
Given sufficient memory, it would be possible to emulate any other processor architecture, and to use algorithms written for other CPUs or run an operating system.
Apart from the memory constraint, another downside of this minimalist approach is reduced speed.
Our board uses a rather slow CPU clock speed of 150 MHz.
As mentioned above, a more expensive FPGA could run at much faster clock speeds.

565 :YAMAGUTIseisei:2018/09/09(日) 08:41:29.57 ID:vMpjqLBja ?2BP(3)
On the other hand, the simplicity of our design allows it to be implemented as a standalone miniature-scale multi-processor computer, thus reducing both physical size and energy consumption.
With proper hardware, it might also be possible to power such devices with low-power solar batteries similar to those used in cheap calculators.
Our implementation is scalable - it is easy to increase the number of processors by connecting additional boards without significant load on the host's power supply.
The host PC does not have to be fast to load the code and read back the results.
Since our implementation is FPGA-based, it is possible to create other types of runtime-reloadable CPUs, customised for specific tasks by reprogramming the FPGA.

In conclusion, we have demonstrated the feasibility of the OISC concept and applied it to build a functional prototype of an OISC multi-processor system.
Our results demonstrate that, with proper hardware and software implementation, substantial computational power can be achieved even with a very simple OISC multi-processor design.




7. Appendix

This section presents pieces of code calculating modular double factorial.

7.1 C with multiplication

The following C program calculates modular double factorial using built-in multiplication and division operations.

566 :YAMAGUTIseisei:2018/09/09(日) 08:42:50.59 ID:vMpjqLBja ?2BP(3)
1   int printf();
2   int main()
3   {
4     int a=5029;
5     int b=1;
6     int m=5039;
7     int x=1;
8     int i,j;
9
10     for( i=a; i>b; i-- )
11       for( j=1; j<=i; j++ )
12         x = (j*x)%m;
13
14     printf("%d",x);
15   }

Lines 10-12 are a double loop multiplying the numbers from b to a modulo m.

7.2 C without multiplication

This C program does the same calculation as the program above in 7.1, but without the built-in multiplication and division operations.
The multiplication and division functions are written explicitly.

1   int DivMod(int a, int b, int *m)
2   {
3     int b1, i1, bp, ip;
4     int z = 0;
5
6   start:
7     if( a<b ){ *m=a; return z; }
8

567 :YAMAGUTIseisei:2018/09/09(日) 08:43:44.94 ID:vMpjqLBja ?2BP(3)
9     b1=b; i1=1;
10
11   next:
12     bp = b1; ip = i1;
13     b1 += b1; i1 += i1;
14
15     if( b1 > a )
16     {
17       a = a-bp;
18       z += ip;
19       goto start;
20     }
21
22     if( b1 < 0 ) return z;
23
24     goto next;
25   }
26
27   int Mult(int a, int b)
28   {
29     int dmm, r=0;
30
31     while(1)
32     {
33       if( !a ) return r;
34       a=DivMod(a,2,&dmm);
35       if( dmm ) r += b;
36       b += b;
37     }
38   }
39
40   int printf();

568 :YAMAGUTIseisei:2018/09/09(日) 08:45:16.48 ID:vMpjqLBja ?2BP(3)
41
42   int a=5029, b=1, m=5039;
43   int k=0, x=1, t;
44
45   int main()
46   {
47   start: k=a;
48   loop: t=Mult(k,x);
49     DivMod(t,m,&x);
50
51     if( --k ) goto loop;
52     if( --a > b ) goto start;
53
54     printf("%d",x);
55   }

Lines 1-25 implement the division algorithm described in 4.5, optimised by removing the recursive call.
The multiplication (lines 27-38) is a straightforward implementation of the formula shown in 4.5.
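For example, Mult(5, 7) runs as follows: a = 5 is odd, so r = 0 + 7 = 7, then a = 2, b = 14; a = 2 is even, so only a = 1, b = 28; a = 1 is odd, so r = 7 + 28 = 35, then a = 0 and the function returns r = 35 = 5 × 7. Each DivMod(a, 2, &dmm) call halves a and leaves its lowest bit in dmm, so the loop is the classic binary shift-and-add multiplication.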




C loops are replaced with goto statements to make the process flow similar to the Subleq implementation in the next subsection, 7.3.

7.3 Subleq code

The Subleq code calculating modular double factorials has been written manually, because the Subleq compiled from C did not fit into the memory.
The code below has 83 instructions, which can fit even into 1 Kb with a 32-bit word.

569 :YAMAGUTIseisei:2018/09/09(日) 08:46:06.76 ID:vMpjqLBja ?2BP(3)
1   0 0 Start
2
3   . A:5029 B:1 MOD:5039
4   . Z:0 K:0 X:1
5
6   Start:
7   A Z; Z K; Z
8
9   Loop:
10   mu_a; K mu_a
11   mu_b; X mu_b
12
13
14   Mult:
15   mu_r
16
17   mu_begin:
18   t2; mu_a t2 mu_return:N2
19
20   dm_a; mu_a dm_a
21   dm_b; C2 dm_b
22   dm_return; N3L dm_return
23   t2 t2 DivMod
24
25   N3:
26   dm_m t2 ?+3
27
28   mu_b mu_r
29
30   mu_a; dm_z mu_a
31   mu_b Z; Z mu_b; Z Z mu_begin
32

570 :YAMAGUTIseisei:2018/09/09(日) 08:48:15.18 ID:vMpjqLBja ?2BP(3)
33   . mu_a:0 mu_b:0 mu_r:0
34
35   #Mult
36
37
38   N2:
39   dm_a; mu_r Z; Z dm_a; Z
40   dm_b; MOD dm_b
41
42   dm_return; N1L dm_return
43   Z Z DivMod
44
45   N1:
46   X; dm_m X
47
48   C1 K ?+3
49   Z Z Loop
50
51   C1 A
52   B A END
53   B Z; Z A; Z
54   K K Start
55
56   END:
57   X (-1)
58   Z Z (-1)
59
60   DivMod:
61
62   dm_z
63   dm_m
64

571 :YAMAGUTIseisei:2018/09/09(日) 08:48:55.68 ID:vMpjqLBja ?2BP(3)
65   dm_start:
66   t1; dm_b t1
67   dm_a t1 ?+6
68   dm_a dm_m; Z Z dm_return:0
69
70   dm_b1; dm_b Z; Z dm_b1; Z
71   dm_i1; C1 dm_i1
72
73  dm_next:
74   dm_bp; dm_b1 dm_bp
75   dm_ip; dm_i1 dm_ip
76
77   dm_b1 Z; Z dm_b1; Z
78   dm_i1 Z; Z dm_i1; Z
79   t1; dm_b1 t1
80   dm_a t1 dm_next
81
82   dm_bp dm_a
83   dm_ip Z; Z dm_z; Z Z dm_start
84
85   . dm_a:0 dm_b:0 dm_z:0
86   . dm_m:0 dm_b1:0 dm_ip:0
87   . dm_i1:0 dm_bp:0 t1:0
88
89   #divMod
90
91   . N1L:-N1 N3L:-N3 t2:0
92   . C1:1 C2:2 0

572 :YAMAGUTIseisei:2018/09/09(日) 08:51:10.16 ID:vMpjqLBja ?2BP(3)
Lines 3 and 4 define variables similar to how variables are defined in the C example above.
A defines the number for which the double factorial is to be calculated; B is the starting number: in our case it is 1, but in general it can be any number.
When the task is distributed among parallel processors, the range B to A is broken into smaller ranges which are submitted to the processors independently.
Upon completion the results are collected and processed further.
MOD is the modulus of the algorithm.
Z is the Subleq zero register.
K is an intermediate value running from A to 1.
And X is the accumulated result.

Line 7 initialises K.
Lines 10 and 11 prepare formal arguments for the multiplication algorithm written between lines 14 and 35.
This code of the multiplication algorithm is almost a one-to-one equivalent of the function Mult written in the previous sub-section.




The only complication is that DivMod is organised here as a function, so its code is reused by the calls at lines 23 and 43.
To make this possible one needs to initialise the formal arguments of the function as well as the return address.
The return address is copied via the indirect labels N1L and N3L.

Lines 39 and 40 take the result from multiplication and initialise arguments for division.
Lines 42 and 43 initialise return address and call DivMod.
Line 46 extracts the result into X.

Lines 48 and 49 decrement K and check whether it is less than 1.
If not, the whole iteration is repeated with K smaller by 1.
If K has reached zero, execution proceeds.

573 :YAMAGUTIseisei:2018/09/09(日) 08:51:32.29 ID:vMpjqLBja ?2BP(3)
Lines 51-54 decrement A and check whether A has reached B.
If yes, we go to the label END.
If not, we go to line 7 and repeat the whole process again, but now with A reduced by 1, hence with K starting from the new value of A.

Line 57, crossed out, is there to print the result.
This instruction is handy when emulating Subleq.
But when calculating on the FPGA board this instruction does not exist, because the board has no concept of input-output operations.
The next line, 58, is a valid Subleq halt command.

Lines 60-89 are the corresponding Subleq code for the function DivMod presented in C in the sub-section above.

Finally, lines 91 and 92 define the return addresses for the calls to DivMod, a temporary t2, and two constants 1 and 2.
The latter is required for the division in the multiplication formula from 4.5.

References

1. Jones, Douglas W. (June 1988). "The Ultimate RISC". ACM SIGARCH Computer Architecture News (New York: ACM) 16 (3): 48–55.
2. Oleg Mazonka, "Bit Copying: The Ultimate Computational Simplicity", Complex Systems Journal 2011, Vol 19, N3, pp. 263–285.
3. http://esolangs.org/wiki/TOGA_computer
4. http://esolangs.org/wiki/ByteByteJump
5. Derivative languages in the references section of http://esolangs.org/wiki/Subleq

574 :YAMAGUTIseisei:2018/09/12(水) 06:50:03.08 ID:Yfdxz6GgZ ?2BP(3)
6. Mavaddat, F.; Parhami, B. (October 1988). "URISC: The Ultimate Reduced Instruction Set Computer". Int'l J. Electrical Engineering Education (Manchester University Press) 25 (4): 327–334.
7. http://esolangs.org/wiki/Subleq
8. http://da.vidr.cc/projects/subleq/
9. http://www.sccs.swarthmore.edu/users/06/adem/engin/e25/finale/




10. http://techtinkering.com/articles/?id=22
11. http://esolangs.org/wiki/Higher_Subleq

575 :YAMAGUTIseisei:2018/09/18(火) 21:31:03.83 ID:F/b4+koTS ?2BP(3)
Numenta Publishes a New Theory That Could Solve the Mystery of How the Brain Transforms Sensations Into Mental Objects
http://businesswire.com/news/home/20171115006003/en/Numenta-Publishes-New-Theory-Solve-Mystery-Brain
A Theory of How Columns in the Neocortex Enable Learning the Structure of the World
http://ncbi.nlm.nih.gov/pmc/articles/PMC5661005/
http://rio2016.5ch.net/test/read.cgi/future/1511446164/819

576 :YAMAGUTIseisei:2018/09/18(火) 21:44:42.66 ID:F/b4+koTS ?2BP(3)
Therefore, the capacity of the network is limited by the pooling capacity of the output layer. Mathematical analysis suggests that a single cortical column can store hundreds of objects before reaching this limit (see Supplementary Material).

To measure actual network capacity, we trained networks with an increasing number of objects and plotted recognition accuracy.
For a single cortical column, with 4,096 cells in the output layer and 150 mini-columns in the input layer, the recognition accuracy remains perfect up to 400 objects (Figure 5A, blue).
The retrieval accuracy drops when the number of learned objects exceeds the capacity of the network.

Figure 5
Recognition accuracy is plotted as a function of the number of learned objects.
(A) Network capacity relative to number of mini-columns in the input layer.
The number of output cells is kept at 4,096 with 40 cells active at any time.
(B) Network capacity ...

From the mathematical analysis, we expect the capacity of the network to increase as the size of the input and output layers increase. We again tested our analysis through simulations.
With the number of active cells fixed, the capacity increases with the number of mini-columns in the input layer (Figure 5A).
This is because with more cells in the input layer, the sparsity of activation increases, and it is less likely for an output cell to be falsely activated.
The capacity also significantly increases with the number of output cells when the size of the input layer is fixed (Figure 5B).
This is because the number of feedforward connections per output cell decreases when there are more output cells available.
We found that if the size of individual columns is fixed, adding columns can increase capacity (Figure 5C).
This is because the lateral connections in the output layer can help disambiguate inputs once individual cortical columns hit their capacity limit. However, this effect is limited; the incremental benefit of additional columns decreases rapidly.

577 :YAMAGUTIseisei:2018/09/18(火) 21:49:00.18 ID:F/b4+koTS ?2BP(3)
The above simulations demonstrate that it is possible for a single cortical column to model and recognize several hundred objects.
Capacity is most impacted by the number of cells in the input and output layers.
Increasing the number of columns has a marginal effect on capacity.
The primary benefit of multiple columns is to dramatically reduce the number of sensations needed to recognize objects.
A network with one column is like looking at the world through a straw; it can be done, but slowly and with difficulty.

Noise robustness
We evaluated the robustness of a single column network to noise.
After the network learned a set of objects, we added varying amounts of random noise to the sensory and location inputs.
The noise affected the active bits in the input without changing its overall sparsity (see Materials and Methods).
Recognition accuracy after 30 touches is plotted as a function of noise (Figure 6A).
There is no impact on the recognition accuracy up to 20% noise in the sensory input and 40% noise in the location input.
We also found that the convergence speed was impacted by noise in the location input (Figure 6B).
It took more sensations to recognize the object when the location input was noisy.

Figure 6
Robustness of a single column network to noise.
(A) Recognition accuracy is plotted as a function of the amount of noise in the sensory input (blue) and in the location input (yellow).
(B) Recognition accuracy as a function of the number of sensations.

578 :yamaguti:2018/09/18(火) 21:50:26.30 ID:F/b4+koTS ?2BP(3)
Mapping to biology
Anatomical evidence suggests that the sensorimotor inference model described above exists at least once in each column (layers 4 and 2/3) and perhaps twice (layers 6a and 5).
We adopt commonly used terminology to describe these layers.
This is a convenience as the connectivity and physiology of cell populations is what matters.
Cells we describe as residing in separate layers may actually intermingle in cortical tissue (Guy and Staiger, 2017).

Layers 4 and 2/3
The primary instance of the model involves layers 4 and 2/3, as illustrated in Figure 7A.
The following properties evident in L4 and L2/3 match our model.
L4 cells receive direct thalamic input from sensory “core” regions (e.g., LGN; Douglas and Martin, 2004).
This input onto proximal dendrites exhibits driver properties (Viaene et al., 2011a).
L4 cells do not form long range connections within their layer (Luhmann et al., 1990).
L4 cells project to and activate cells in L2/3 (Lohmann and Rörig, 1994; Feldmeyer et al., 2002; Sarid et al., 2007), and receive feedback from L2/3 (Lefort et al., 2009; Markram et al., 2015).
L2/3 cells project long distances within their layer (Stettler et al., 2002; Hunt et al., 2011) and are also a major output of cortical columns (Douglas and Martin, 2004; Shipp, 2007).
It is known that L2/3 activation follows L4 activation (Constantinople and Bruno, 2013).

579 :yamaguti:2018/09/18(火) 21:50:54.03 ID:F/b4+koTS ?2BP(3)
Figure 7
Mapping of sensorimotor inference network onto experimentally observed cortical connections.
Arrows represent documented pathways.
(A) First instance of network; L4 is input layer, L2/3 is output layer.
Green arrows are feedforward pathway, from thalamo-cortical ...

The model predicts that a representation of location is input to the basal distal dendrites of the input layer.
A timing requirement of our model is that the location signal is a predictive signal that must precede the arrival of the sensory input.
This is illustrated by the red line in Figure 7A.
About 45% of L4 synapses come from cells in L6a (Binzegger et al., 2004).
The axon terminals were found to show a strong preference for contacting basal dendrites (McGuire et al., 1984) and activation of L6a cells caused weak excitation of L4 cells (Kim et al., 2014).
Therefore, we propose that the location representation needed for the upper model comes from L6a.

580 :yamaguti:2018/09/18(火) 21:52:53.05 ID:F/b4+koTS ?2BP(3)
Layers 6a and 5
Another potential instance of the model is in layers 6a and 5, as illustrated in Figure 7B.
The following properties evident in L6a and L5 match our model.
L6a cells receive direct thalamic input from sensory “core” regions (e.g., LGN; Thomson, 2010).
This input exhibits driver properties and resembles the thalamocortical projections to L4 (Viaene et al., 2011b).
L6a cells project to and activate cells in L5 (Thomson, 2010).
Recent experimental studies found that the axons of L6 CT neurons densely ramified within layer 5a in both visual and somatosensory cortices of the mouse,
and activation of these neurons generated large excitatory postsynaptic potentials (EPSPs) in pyramidal neurons in layer 5a (Kim et al., 2014).
L6a cells receive feedback from L5 (Thomson, 2010).
L5 cells project long distances within their layer (Schnepel et al., 2015) and L5 cells are also a major output of cortical columns (Douglas and Martin, 2004; Guillery and Sherman, 2011; Sherman and Guillery, 2011).
There are three types of pyramidal neurons in L5 (Kim et al., 2015).
Here we are referring to only one of them, the larger neurons with thick apical trunks that send an axon branch to relay cells in the thalamus (Ramaswamy and Markram, 2015).
However, there is also empirical evidence our model does not map cleanly to L6a and L5.
For example, Constantinople and Bruno (2013) have shown a sensory stimulus will often cause L5 cells to fire simultaneously or even slightly before L6 cells, which is inconsistent with the model.
Therefore, whether L6a and L5 can be interpreted as an instance of the model is unclear.

581 :yamaguti:2018/09/18(火) 21:56:04.78 ID:F/b4+koTS ?2BP(3)
Origin of location signal
The derivation of the location representation in L6a is unknown.
Part of the answer will involve local processing within the lower layers of the column and part will likely involve long range connections between corresponding regions in “what” and “where” pathways (Thomson, 2010).
Parallel “what” and “where” pathways exist in all the major sensory modalities (Ungerleider and Haxby, 1994; Ahveninen et al., 2006).
Evidence suggests that regions in “what” pathways form representations that exhibit increasing invariance to translation, rotation or scale and increasing selectivity to sensory features in object centered coordinates (Rust and DiCarlo, 2010).
This effect can be interpreted as forming allocentric representations.
In contrast, it has been proposed that regions in “where” pathways form representations in egocentric coordinates (Goodale and Milner, 1992).
If an egocentric motor behavior is generated in a “where” region, then a copy of the motor command will need to be sent to the corresponding “what” region where it can be converted to a new predicted allocentric location.
The conversion is dependent on the current position and orientation of the object relative to the body.
It is for this reason we suggest that the origin of the location signal might involve long-range connections between “where” and “what” regions.
In the Discussion section we will describe how the location might be generated.

582 :yamaguti:2018/09/18(火) 21:56:55.79 ID:F/b4+koTS ?2BP(3)
Physiological evidence
In addition to anatomical support, there are several physiological predictions of the model that are supported by empirical observation.
L4 and L6a cells exhibit “simple” receptive fields (RFs) while L2/3 and L5 cells exhibit “complex” RFs (Hubel and Wiesel, 1962; Gilbert, 1977).
Key properties of complex cells include RFs influenced by a wider area of sensory input and increased temporal stability (Movshon et al., 1978).
L2/3 cells have receptive fields that are twice the size of L4 cells in the primary somatosensory cortex (Chapin, 1986).
A distinct group of cells with large and non-oriented receptive fields were found mostly in layer 5 of the visual cortex (Mangini and Pearlman, 1980; Lemmon and Pearlman, 1981).
These properties are consistent with, and observed in, the output layer of our model.

The model predicts that cells in a mini-column in the input layer (L4 and L6a) will have nearly identical RFs when presented with an input that cannot be predicted as part of a previously learned object.
However, in the context of learned objects, the cells in a mini-column will differentiate.
One key differentiation is that individual cells will respond only in specific contexts.
This differentiation has been observed in multiple modalities (Vinje and Gallant, 2002; Yen et al., 2006; Martin and Schröder, 2013; Gavornik and Bear, 2014).
Our model is also consistent with findings that early sensory areas are biased toward recent perceptual recognition results (St. John-Saaltink et al., 2016).

583 :yamaguti:2018/09/18(火) 21:57:27.83 ID:F/b4+koTS ?2BP(3)
A particularly relevant version of this phenomenon is “border ownership” (Zhou et al., 2000).
Cells which have similar classic receptive fields when presented with isolated edge-like features diverge and fire uniquely when the feature is part of a larger object.
Specifically, the cells fire when the feature is at a particular location on a complex object, a behavior predicted and exhibited by our model.
To explain border ownership, researchers have proposed a layer of cells that perform “grouping” of inputs.
The grouping cells are stable over time (Craft et al., 2007).
The output layer of our model performs this function.
“Border ownership” is a form of complex object modeling.
It has been observed in both primary and secondary sensory regions (Zhou et al., 2000).
We predict that similar properties can be observed in primary and secondary sensory regions for even more complex and three-dimensional objects.

Lee et al. show that enhancement of motor cortex activity facilitates sensory-evoked responses of topographically aligned neurons in primary somatosensory cortex (Lee et al., 2008).
Specifically, they found that S1 corticothalamic neurons in whisker/barrel cortex responded more robustly to whisker deflections when motor cortex activity was focally enhanced.
This supports the model hypothesis that behaviorally-generated location information projects in a column-by-column fashion to primary sensory regions.

584 :yamaguti:2018/09/18(火) 21:59:33.62 ID:F/b4+koTS ?2BP(3)
Discussion
Relationship with previous models
Due to the development of new experimental techniques, knowledge of the laminar circuitry of the cortex continues to grow (Thomson and Bannister, 2003; Thomson and Lamy, 2007).
It is now possible to reconstruct and simulate the circuitry in an entire cortical column (Markram et al., 2015).
Over the years, numerous efforts have been undertaken to develop models of cortical columns.
Many cortical column models aim to explain neurophysiological properties of the cortex.
For example, based on their studies of the cat visual cortex, Douglas and Martin (1991) provided one of the first canonical microcircuit models of a cortical column.
This model explains intracellular responses to pulsed visual stimulations and has remained highly influential (Douglas and Martin, 2004).
Hill and Tononi (2004) constructed a large-scale model of point neurons that are organized in a repeating columnar structure to explain the difference of brain states during sleep and wakefulness.
Traub et al. (2004) developed a single-column network model based on multi-compartmental biophysical models to explain oscillatory, epileptic, and sleeplike phenomena.
Haeusler and Maass (2007) compared cortical microcircuit models with and without the lamina-specific structure and demonstrated several computational advantages of more realistic cortical microcircuit models.
Reimann et al. (2013) showed that the neocortical local field potentials can be explained by a cortical column model composed of >12,000 reconstructed multi-compartmental neurons.

585 :yamaguti:2018/09/18(火) 22:03:33.31 ID:F/b4+koTS ?2BP(3)
Although these models provided important insights on the origin of neurophysiological signals, there are relatively few models proposing the functional roles of layers and columns.
Bastos et al. (2012) discussed the correspondence between the micro-circuitry of the cortical column and the connectivity implied by predictive coding.
This study used a coarse microcircuit model based on the work of Douglas and Martin (2004) and lacked recent experimental evidence and detailed connectivity patterns across columns.

Raizada and Grossberg (2003) described the LAMINART model to explain how attention might be implemented in the visual cortex.
This study highlighted the anatomical connections of the L4-L2/3 network and proposed that perceptual grouping relies on long-range lateral connections in L2/3.
This is consistent with our proposal of the stable object representation in L2/3.
A recent theory of optimal context integration proposes that long-range lateral connections are used to optimally integrate information from the surround (Iyer and Mihalas, 2017).
The structure of their model is broadly consistent with the theories presented here, and provides a possible mathematical basis for further analysis.

The benefit of cortical columns
Our research has been guided by Mountcastle's definition of a cortical column (Mountcastle, 1978, 1997), as a structure “formed by many mini-columns bound together by short-range horizontal connections.”
The concept plays an essential role in the theory presented in this paper.
Part of our theory is that each repetitive unit, or “column,” of sensory cortex can learn complete objects by locally integrating sensory and location data over time.
In addition, we have proposed that multiple cortical columns greatly speed up inference and recognition time by integrating information in parallel across dispersed sensory areas.

586 :yamaguti:2018/09/18(火) 22:10:09.20 ID:F/b4+koTS ?2BP(3)
An open issue is the exact anatomical organization of columns.
We have chosen to describe a model of columns with discrete inter-column boundaries.
This type of well-defined structure is most clear in the rat barrel cortex (Lubke et al., 2000; Bureau et al., 2004; Feldmeyer et al., 2013)
but Mountcastle and others have pointed out that although there are occasional discontinuities in physiological and anatomical properties,
there is a diverse range of structures and the more general rule is continuity (Mountcastle, 1978; Horton and Adams, 2005; Rockland, 2010).

Mountcastle's concept of a repetitive functional unit, whether continuous or discrete, is useful to understand the principles of cortical function.
Our model assigns a computational benefit to columns, that of integrating discontinuous information in parallel across disparate areas.
This basic capability is independent of any specific type of column (such as, hypercolumns or ocular dominance columns), and independent of discrete or continuous structures.
The key requirement is that each column models a different subset of sensory space and is exposed to different parts of the world as sensors move.


587 :yamaguti:2018/09/18(火) 22:11:15.12 ID:F/b4+koTS ?2BP(3)
Generating the location signal
A key prediction of our model is the presence of a location signal in each column of a cortical region.
We deduced the need for this signal based on the observation that cortical regions predict new sensory inputs due to movement (Duhamel et al., 1992; Nakamura and Colby, 2002; Li and DiCarlo, 2008).
To predict the next sensory input, a patch of neocortex needs to know where a sensor will be on a sensed object after a movement is completed.
The prediction of location must be done separately for each part of a sensor array.
For example, for the brain to predict what each finger will feel on a given object, it has to predict a separate allocentric location for each finger.
There are dozens of semi-independent areas of sensation on each hand, each of which can sense a different location and feature on an object.
Thus, the allocentric location signals must be computed in a part of the brain where somatic topology is similarly granular.
For touch, this suggests the derivation of allocentric location is occurring in each column throughout primary regions such as, S1 and S2.
The same argument holds for primary visual regions, as each patch of the retina observes different parts of objects.

588 :yamaguti:2018/09/18(火) 22:15:52.69 ID:F/b4+koTS ?2BP(3)
Although we don't know how the location signal is generated, we can list some theoretically-derived requirements.
A column needs to know its current location on an object, but it also needs to predict what its new location will be after a movement is completed.
To translate an egocentric motor signal into a predicted allocentric location, a column must also know the orientation of the object relative to the body part doing the moving.
This can be expressed in the pseudo-equation [current location + orientation of object + movement ⇒ predicted new location].
This is a complicated task for neurons to perform.
Fortunately, it is highly analogous to what grid cells do.
Grid cells are a proof that neurons can perform these types of transformations, and they suggest specific mechanisms that might be deployed in cortical columns.

1.
Grid cells in the entorhinal cortex (Hafting et al., 2005; Moser et al., 2008) encode the location of an animal's body relative to an external environment.
A sensory cortical column needs to encode the location of a part of the animal's body (a sensory patch) relative to an external object.

2.
Grid cells use path integration to predict a new location due to movement (Kropff et al., 2015).
A column must also use path integration to predict a new location due to movement.

3.
To predict a new location, grid cells combine the current location with movement and with head direction cells (Moser et al., 2014).
Head direction cells represent the “orientation” of the “animal” relative to an external environment.
Columns need a representation of the “orientation” of a “sensory patch” relative to an external object.

4.
The representation of space using grid cells is dimensionless.
The dimensionality of the space they represent is defined by the tiling of grid cells, combined with how the tiling maps to behavior.
Similarly, our model uses representations of location that are dimensionless.

589 :yamaguti:2018/09/18(火) 22:17:38.66 ID:F/b4+koTS ?2BP(3)
These analogs, plus the fact that grid cells are phylogenetically older than the neocortex, lead us to hypothesize that the cellular mechanisms used by grid cells were preserved and replicated in the sub-granular layers of each cortical column.
It is not clear if a column needs neurons that are analogous to place cells (Moser et al., 2015).
Place cells are believed to associate a location (derived from grid cells) with features and events.
They are believed to be important for episodic memory.
Presently, we don't see an analogous requirement in cortical columns.

590 :yamaguti:2018/09/18(火) 22:18:35.23 ID:F/b4+koTS ?2BP(3)
Today we have no direct empirical evidence to support the hypothesis of grid-cell like functionality in each cortical column.
We have only indirect evidence.
For example, to compute location, cortical columns must receive dynamically updated inputs regarding body pose.
There is now significant evidence that cells in numerous cortical areas, including sensory regions, are modulated by body movement and position.
Primary visual and auditory regions contain neurons that are modulated by eye position (Trotter and Celebrini, 1999; Werner-Reiss et al., 2003) as do areas MT, MST, and V4 (Bremmer, 2000; DeSouza et al., 2002).
Cells in frontal eye fields (FEF) respond to auditory stimuli in an eye-centered frame of reference (Russo and Bruce, 1994).
Posterior parietal cortex (PPC) represents multiple frames of reference including head-centered (Andersen et al., 1993) and body-centered (Duhamel et al., 1992; Brotchie et al., 1995, 2003; Bolognini and Maravita, 2007) representations.
Motor areas also contain a diverse range of reference frames, from representations of external space independent of body pose to representations of specific groups of muscles (Graziano and Gross, 1998; Kakei et al., 2003).
Many of these representations are granular, specific to particular body areas, and multisensory, implying numerous transformations are occurring in parallel (Graziano et al., 1997; Graziano and Gross, 1998; Rizzolatti et al., 2014).
Some models have shown that the above information can be used to perform coordinate transformations (Zipser and Andersen, 1988; Pouget and Snyder, 2000).

Determining how columns derive the allocentric location signal is a current focus of our research.

591 :yamaguti:2018/09/18(火) 22:19:13.83 ID:F/b4+koTS ?2BP(3)
Role of inhibitory neurons
There are several aspects of our model that require inhibition.
In the input layer, neurons in mini-columns mutually inhibit each other.
Specifically, neurons that are partially depolarized (in the predictive state) generate a first action potential slightly before cells that are not partially depolarized.
Cells that spike first prevent other nearby cells from firing.
This requires a very fast, winner-take-all type of inhibition among nearby cells, and suggests that such fast inhibitory neurons contain stimulus-related information, which is consistent with recent experimental findings (Reyes-Puerta et al., 2015a,b).
Simulations of the timing requirement for this inhibition can be found in Billaudelle and Ahmad (2015).
Activations in the output layer do not require very fast inhibition.
Instead, a broad inhibition within the layer is needed to maintain the sparsity of activation patterns.
Experimental evidence for both fast and broad inhibition has been reported in the literature (Helmstaedter et al., 2009; Meyer et al., 2011).

Our simulations do not model inhibitory neurons as individual cells.
The functions of inhibitory neurons are encoded in the activation rules of the model.
A more detailed mapping to specific inhibitory neuron types is an area for future research.

Hierarchy
The neocortex processes sensory input in a series of hierarchically arranged regions.
As input ascends from region to region, cells respond to larger areas of the sensory array and to more complex features.
A common assumption is that complete objects can only be recognized at a level in the hierarchy where cells respond to input over the entire sensory array.

592 :yamaguti:2018/09/18(火) 22:25:26.84 ID:F/b4+koTS ?2BP(3)
Our model proposes an alternate view.
All cortical columns, even columns in primary sensory regions, are capable of learning representations of complete objects.
However, our network model is limited by the spatial extent of the horizontal connections in the output layer.
Therefore, hierarchy is still required in many situations.
For example, say we present an image of a printed letter on the retina.
If the letter occupies a small part of the retina, then columns in V1 could recognize the letter.
If, however, the letter is expanded to occupy a large part of the retina,
then columns in V1 would no longer be able to recognize the letter because the features that define the letter are too far apart to be integrated by the horizontal connections in L2/3.
In this case, a converging input onto a higher cortical region would be required to recognize the letter.
Thus, the cortex learns multiple models of objects, both within a region and across hierarchical levels.

What would occur if multiple objects were being sensed at the same time?
In our model, one part of a sensory array could be sensing one object and another part of the sensory array could be sensing a different object.
Difficulty would arise if the sensations from two or more objects were overlaid or interspersed on a region, such as if your index and ring finger touched one object while your thumb and middle finger touched another object.
In these situations, we suspect the system would settle on one interpretation or the other.


593 :yamaguti:2018/09/18(火) 22:27:46.03 ID:F/b4+koTS ?2BP(3)
Sensory information is processed in parallel pathways, sometimes referred to as “what” and “where” pathways.
We propose that our object recognition model exists in “what” regions, which are associated with the ability to recognize objects.
How might we interpret “where” pathways in light of our model? First, the anatomy in the two pathways is similar.
This suggests that “what” and “where” regions perform similar operations, but achieve different results by processing different types of data.
For example, our network might learn models of ego-centric space if the location signal represented ego-centric locations.
Second, we suspect that bi-directional connections between what and where regions are required for converting ego-centric motor behaviors into allocentric locations.
We are currently exploring these ideas.

594 :yamaguti:2018/09/18(火) 22:28:32.19 ID:F/b4+koTS ?2BP(3)
Vision, audition, and beyond
We described our model using somatic sensation.
Does it apply to other sensory modalities? We believe it does.
Consider vision.
Vision and touch are both based on an array of receptors topologically mapped to an array of cortical columns.
The retina is not like a camera.
The blind spot and blood vessels prevent all parts of an object from being sensed simultaneously, and the density of receptors in the retina is not uniform.
Similarly, the skin cannot sense all parts of an object at once, and the distribution of somatic receptors is not uniform.
Our model is indifferent to discontinuities and non-uniformities.
Both the skin and retina move, exposing cortical columns to different parts of sensed objects over time.
The methods for determining the allocentric location signal for touch and vision would differ somewhat.
Somatic sensation has access to richer proprioceptive inputs, whereas vision has access to other clues such as ocular disparity.
Aside from differences in how allocentric location is determined, our model is indifferent to the underlying sensory modality.
Indeed, columns receiving visual input could be interspersed with columns receiving somatic input, and the long-range intercolumn connections in our model would unite these into a single object representation.

595 :yamaguti:2018/09/18(火) 22:31:02.38 ID:F/b4+koTS ?2BP(3)
Similar parallels can be made for audition.
Perhaps the more powerful observation is that the anatomy supporting our model exists in most, if not all, cortical regions.
This suggests that no matter what kind of information a region is processing, its feedforward input is interpreted in the context of a location.
This would apply to high-level concepts as well as low-level sensory data.
This hints at why it is easier to memorize a list of items when they are mentally associated with physical locations, and why we often use mental imagery to convey abstract concepts.

Testable predictions
A number of experimentally testable predictions follow from this theory.

1.
The theory predicts that sensory regions will contain cells that are stable over movements of a sensor while sensing a familiar object.

2.
The set of stable cells will be both sparse and specific to object identity.
The cells that are stable for a given object will in general have very low overlap with those that are stable for a completely different object.

3.
Layers 2/3 of cortical columns will be able to independently learn and model complete objects.
We expect that the complexity of the objects a column can model will be related to the extent of long-range lateral connections.

4.
Activity within the output layer of each cortical column (layers 2/3) will become sparser as more evidence is accumulated for an object.
Activity in the output layer will be denser for ambiguous objects.
These effects will only be seen when the animal is freely observing familiar objects.

5.
These output layers will form stable representations.
In general, their activity will be more stable than layers without long-range connections.

6.
Activity within the output layers will converge on a stable representation slower with long-range lateral connections disabled, or with input to adjacent columns disabled.

596 :yamaguti:2018/09/18(火) 22:34:10.70 ID:F/b4+koTS ?2BP(3)
7.
The theory provides an algorithmic explanation for border ownership cells (Zhou et al., 2000).
In general each region will contain cells tuned to the location of features in the object's reference frame.
We expect to see these representations in layer 4.


Summary
Our research has focused on how the brain makes predictions of sensory inputs.
Starting with the premise that all sensory regions make predictions of their constantly changing input, we deduced that each small area in a sensory region must have access to a location signal that represents where on an object the column is sensing.
Building on this idea, we deduced the probable function of several cellular layers and are beginning to understand what cortical columns in their entirety might be doing.
Although there are many things we don't understand, the big picture is increasingly clear.
We believe each cortical column learns a model of “its” world, of what it can sense.
A single column learns the structure of many objects and the behaviors that can be applied to those objects.
Through intra-laminar and long-range cortical-cortical connections, columns that are sensing the same object can resolve ambiguity.

In 1978 Vernon Mountcastle reasoned that since the complex anatomy of cortical columns is similar in all of the neocortex, then all areas of the neocortex must be performing a similar function (Mountcastle, 1978).
His hypothesis remains controversial partly because we haven't been able to identify what functions a cortical column performs, and partly
because it has been hard to imagine what single complex function is applicable to all sensory and cognitive processes.


597 :yamaguti:2018/09/18(火) 22:45:59.69 ID:F/b4+koTS ?2BP(3)
The model of a cortical column presented in this paper is described in terms of sensory regions and sensory processing, but the circuitry underlying our model exists in all cortical regions.
Thus, if Mountcastle's conjecture is correct, even high-level cognitive functions such as mathematics, language, and science would be implemented in this framework.
It suggests that even abstract knowledge is stored in relation to some form of 'location' and that much of what we consider to be 'thought' is implemented by inference and behavior-generating mechanisms originally evolved to move and infer with fingers and eyes.

Materials and methods
Here we formally describe the activation and learning rules for the HTM sensorimotor inference network.
We use a modified version of the HTM neuron model (Hawkins and Ahmad, 2016) in the network.
There are three basic aspects of the algorithm: initialization, computing cell states, and learning.
These steps are described along with implementation and simulation details.

Notation
Let N_in represent the number of mini-columns in the input layer, M the number of cells per mini-column in the input layer, N_out the number of cells in the output layer, and N_c the number of cortical columns.
The number of cells in the input layer and output layer is M·N_in and N_out, respectively, for each cortical column.
Each input cell receives both the sensory input and a contextual input that corresponds to the location signal.
The location signal is an N_ext dimensional sparse vector L.

598 :yamaguti:2018/09/18(火) 22:48:27.72 ID:F/b4+koTS ?2BP(3)
Each cell can be in one of three states: active, predictive, or inactive.
We use M × N_in binary matrices A^in and Π^in to denote the activation state and the predictive state of the input cells, and use the N_out dimensional binary vector A^out to denote the activation state of the output cells in a cortical column.
The concatenated output of all cortical columns is represented as an N_out·N_c dimensional binary vector.
At any point in time there are only a small number of cells active, so these are generally very sparse.

Each cell maintains a single proximal dendritic segment and a set of basal distal dendritic segments (denoted as basal below).
Proximal segments contain feedforward connections to that cell.
Basal segments represent contextual input.
The contextual input acts as a tiebreaker and biases the cell to win.
The contextual input to a cell in the input layer is a vector representing the external location signal L.
The contextual input to a cell in the output layer comes from other output cells in the same or different cortical columns.

For each dendritic segment, we maintain a set of “potential” synapses between the dendritic segment and other cells that could potentially form a synapse with it (Chklovskii et al., 2004; Hawkins and Ahmad, 2016).
Learning is modeled by the growth of new synapses from this set of potential synapses.
A “permanence” value is assigned to each potential synapse and represents the growth of the synapse.
Potential synapses are represented by permanence values greater than zero.
A permanence value close to zero represents an unconnected synapse that is not fully grown.
A permanence value greater than the connection threshold represents a connected synapse.
Learning occurs by incrementing or decrementing permanence values.

599 :yamaguti:2018/09/18(火) 22:50:40.71 ID:F/b4+koTS ?2BP(3)
We denote the synaptic permanences of the dth dendritic segment of the ith input cell in the jth mini-column as an N_ext × 1 vector D^{d,in}_ij.
Similarly, the permanences of the dth dendritic segment of the ith output cell form the (N_out·N_c) × 1 dimensional vector D^{d,out}_i.

Output neurons receive feedforward connections from input neurons within the same cortical column.
We denote these connections with an M × N_in × N_out tensor F, where f_ijk represents the permanence of the synapse between the ith input cell in the jth mini-column and the kth output cell.

For D and F, we will use a dot (e.g., Ḋ) to denote the binary vector representing the subset of potential synapses on a segment (i.e., permanence value above 0).
We use a tilde (e.g., D̃) to denote the binary vector representing the subset of connected synapses (i.e., permanence value above the connection threshold).

Initialization
Each dendritic segment is initialized to contain a random set of potential synapses.
Each D^{d,in}_ij is initialized to contain a random set of potential synapses chosen from the location input.
Segments in D^{d,out}_i are initialized to contain a random set of potential synapses to other output cells.
These can include cells from the same cortical column.
We enforce the constraint that a given segment only contains synapses from a single column.
In all cases the permanence values of potential synapses are chosen randomly: initially some are connected (above threshold) and some are unconnected.

Computing cell states
A cell in the input layer is predicted if any of its basal distal segments have sufficient activity:

(1)   $\pi^{in}_{ij} = \begin{cases} 1 & \text{if } \exists d:\ \| \tilde{D}^{d,in}_{ij} \circ L \|_1 > \theta^{b}_{in} \\ 0 & \text{otherwise} \end{cases}$

where $\theta^{b}_{in}$ is the activation threshold of the basal distal dendrite of an input cell.

600 :yamaguti:2018/09/18(火) 22:52:37.71 ID:F/b4+koTS ?2BP(3)
For the input layer, all the cells in a mini-column share the same feedforward receptive fields.
Following (Hawkins and Ahmad, 2016) we assume that an inhibitory process selects a set of s mini-columns that best match the current feedforward input pattern.
We denote this winner set as W^in.
The set of active input layer cells is calculated as follows:

(2)   $a^{in}_{ij} = \begin{cases} 1 & \text{if } j \in W^{in} \text{ and } \pi^{in}_{ij} = 1 \\ 1 & \text{if } j \in W^{in} \text{ and } \sum_{k} \pi^{in}_{kj} = 0 \\ 0 & \text{otherwise} \end{cases}$

The first conditional states that predicted cells in a winning mini-column become active.
If no cell in a mini-column is predicted, all cells in that mini-column become active (second conditional).
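A minimal C sketch of Equations (1) and (2) as reconstructed above (our illustration; the names, constants, and dense data layout are assumptions, not the reference implementation):

    #define M        16   /* cells per mini-column                   */
    #define THETA_B   6   /* basal activation threshold, input layer */

    /* Equation (1): a cell is predicted if any of its basal segments has
       more than THETA_B connected synapses onto active bits of L.
       seg_overlap[d] holds that count for segment d.                     */
    int predicted(const int *seg_overlap, int n_segments)
    {
        for (int d = 0; d < n_segments; d++)
            if (seg_overlap[d] > THETA_B)
                return 1;
        return 0;
    }

    /* Equation (2): within a winning mini-column, activate the predicted
       cells; if none is predicted, all M cells become active ("bursting") */
    void activate_minicolumn(const int pi[M], int a[M])
    {
        int any_predicted = 0;
        for (int i = 0; i < M; i++)
            any_predicted |= pi[i];
        for (int i = 0; i < M; i++)
            a[i] = any_predicted ? pi[i] : 1;
    }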

To determine activity in the output layer we calculate the feedforward and lateral input to each cell.
Cells with enough feedforward overlap with the input layer, and the most lateral support from the previous time step become active.
The feedforward overlap to the kth output cell is:

(3)   $f_k = \| \tilde{F}_k \circ A^{in} \|_1$

The set of output cells with enough feedforward input is computed as:

(4)   $V = \{\, k \mid f_k \geq \theta^{p}_{out} \,\}$

where $\theta^{p}_{out}$ is a threshold.
We then select the active cells using the number of active basal segments as a sorting function:

(5)   $a^{out}_k = I[\, k \in V \,] \cdot I[\, n^{t-1}_k \geq \min( n^{t-1}_{(s)}, \theta^{b}_{out} ) \,]$

where $n^{t-1}_k$ represents the number of active basal segments of cell k in the previous time step, and the sth highest number of active basal segments is denoted as $n^{t-1}_{(s)}$.
$\theta^{b}_{out}$ is the activation threshold of the basal distal dendrite of an output cell.
I[] is the indicator function, and s is the minimum desired number of active neurons.
If the number of cells with lateral support is less than s in a cortical column, $n^{t-1}_{(s)}$ would be zero and all cells with enough feedforward input will become active.
Note that we used a modified version of the original HTM neuron model in the output layer by considering the effect of multiple active basal segments.
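A C sketch of the output-layer activation of Equations (3)-(5) as reconstructed above (again our illustration under assumed names and a dense layout; the cutoff search below is one simple way to realise the min(n_(s), θ) rule, not the reference code):

    #define N_OUT    4096  /* output cells per column          */
    #define THETA_P     3  /* proximal (feedforward) threshold */
    #define THETA_BO   18  /* basal threshold, output layer    */
    #define S_MIN      40  /* minimum desired active cells     */

    void activate_output(const int ff_overlap[N_OUT],    /* Eq. (3), precomputed  */
                         const int n_basal_prev[N_OUT],  /* basal segments at t-1 */
                         int a_out[N_OUT])
    {
        /* find the basal-support cutoff: the largest t <= THETA_BO that
           still admits at least S_MIN cells with enough feedforward input;
           t drops to 0 when fewer than S_MIN cells have lateral support   */
        int t, k;
        for (t = THETA_BO; t > 0; t--) {
            int n = 0;
            for (k = 0; k < N_OUT; k++)
                if (ff_overlap[k] >= THETA_P && n_basal_prev[k] >= t)
                    n++;
            if (n >= S_MIN)
                break;
        }
        /* Equations (4) and (5): enough feedforward input AND enough
           lateral support relative to the other cells                 */
        for (k = 0; k < N_OUT; k++)
            a_out[k] = (ff_overlap[k] >= THETA_P && n_basal_prev[k] >= t);
    }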

601 :yamaguti:2018/09/18(火) 22:55:38.21 ID:F/b4+koTS ?2BP(3)
Learning in the input layer
In the input layer, basal segments represent predictions.
At any point in time, only segments that match the contextual input are modified.
If a cell was predicted (Equation 1) and becomes active, the corresponding basal segments are selected for learning.
If no cell in an active mini-column was predicted, we select a winning cell as the cell with the best basal input match (via random initial conditions).

For selected segments, we decrease the permanence of inactive synapses by a small value $p^{-}$ and increase the permanence of active synapses by a larger value $p^{+}$:

(6)   $\Delta D^{d,in}_{ij} = p^{+}\,( \dot{D}^{d,in}_{ij} \circ L ) - p^{-}\,( \dot{D}^{d,in}_{ij} \circ (1 - L) )$

where ∘ represents element-wise multiplication.
Incorrect predictions are punished:
if a basal dendritic segment on a cell becomes active and the cell subsequently does not become active, we slightly decrement the permanences of active synapses on the corresponding segments.
Note that in Equation (6), learning is applied to all potential synapses (denoted by the dot).
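A C sketch of the Equation (6) permanence update on one selected segment (illustrative; the increment/decrement constants and the dense permanence array are assumptions):

    #define N_EXT 2400    /* dimensionality of the location signal L */
    #define P_INC 0.10f   /* p+ : reinforcement of active synapses   */
    #define P_DEC 0.02f   /* p- : decay of inactive synapses         */

    void learn_segment(float perm[N_EXT], const int L[N_EXT])
    {
        for (int n = 0; n < N_EXT; n++) {
            if (perm[n] <= 0.0f)
                continue;                       /* not a potential synapse */
            perm[n] += L[n] ? P_INC : -P_DEC;   /* Equation (6)            */
            if (perm[n] > 1.0f) perm[n] = 1.0f; /* keep within [0, 1]      */
            if (perm[n] < 0.0f) perm[n] = 0.0f;
        }
    }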

Learning in the output layer
When learning a new object a sparse set of cells in the output layer is selected to represent the new object.
These cells remain active while the system senses the object at different locations.
Thus, each output cell pools over multiple feature/location representations in the input layer.

For each sensation, proximal synapses are learned by increasing the permanence of active synapses by $p^{+}$ and decreasing the permanence of inactive synapses by $p^{-}$:

(7)   $\Delta F_k = p^{+}\,( \dot{F}_k \circ A^{in} ) - p^{-}\,( \dot{F}_k \circ (1 - A^{in}) )$

Basal segments of active output cells are learned using a rule similar to Equation (7):

(8)   $\Delta D^{d,out}_{i} = p^{+}\,( \dot{D}^{d,out}_{i} \circ A^{out} ) - p^{-}\,( \dot{D}^{d,out}_{i} \circ (1 - A^{out}) )$
Feedback

602 :yamaguti:2018/09/18(火) 22:57:08.46 ID:F/b4+koTS ?2BP(3)
Feedback from the output layer to the input layer is used as an additional modulatory input to fine-tune which cells in a winning mini-column become active.
Cells in the input layer maintain a set of apical segments similar to the set of basal segments.
If a cell has apical support (i.e., an active apical segment), we use a slightly lower value of the basal activation threshold θ^b_in when calculating the predictive state Π^in.
In addition, if multiple cells in a mini-column are predicted, only the cells with feedback become active.
These rules make the set of active cells more precise with respect to the current representation in the output layer.
Apical segments on winning cells in the input layer are learned using exactly the same rules as basal segments.

Simulation details
To generate our convergence and capacity results we generated a large number of objects.
Each object consists of a number of sensory features, with each feature assigned to a corresponding location.
We encode each location as a 2,400-dimensional sparse binary vector with 10 random bits active.
Each sensory feature is similarly encoded by a vector with 10 random bits active.
The length of the sensory feature vector is the same as the number of mini-columns of the input layer, N_in.
The input layer contains 150 mini-columns and 16 cells per mini-column, with 10 mini-columns active at any time.
The activation threshold of the basal distal dendrites of input neurons is 6.
The output layer contains 4,096 cells and the minimum number of active output cells is 40.
The activation threshold is 3 for proximal dendrites and 18 for basal dendrites of output neurons.
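A C sketch of the encoding just described: a sparse binary vector with exactly w of n bits active, chosen at random (our illustration; rand() is used for brevity and is not the reference implementation):

    #include <stdlib.h>

    void random_sdr(int *bits, int n, int w)  /* e.g. n = 2400, w = 10 */
    {
        int i, k;
        for (i = 0; i < n; i++)
            bits[i] = 0;
        for (k = 0; k < w; ) {
            i = rand() % n;
            if (!bits[i]) {       /* skip already-chosen positions */
                bits[i] = 1;
                k++;
            }
        }
    }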

603 :yamaguti:2018/09/18(火) 23:02:11.17 ID:F/b4+koTS ?2BP(3)
During training, the network learns each object in random order.
For each object, the network senses each feature three times.
The activation pattern in the output layer is saved for each object to calculate retrieval accuracy.
During testing, we allow the network to sense each object at K locations.
After each sensation, we classify the activity pattern in the output layer.
We say that an object is correctly classified
if, for each cortical column, the overlap between the output layer and the stored representation for the correct object is above a threshold, and the overlaps with the stored representations for all other objects are below that threshold.
We use a threshold of 30.
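A C sketch of this classification rule for one cortical column (illustrative names and a dense layout; the stored representations are the activation patterns saved during training):

    #define N_OUT  4096
    #define THRESH   30

    /* number of bits active in both binary vectors */
    int overlap(const int a[N_OUT], const int b[N_OUT])
    {
        int n = 0;
        for (int k = 0; k < N_OUT; k++)
            n += a[k] & b[k];
        return n;
    }

    /* returns 1 iff the current output activity matches only the correct
       object: overlap above THRESH for it, below THRESH for all others  */
    int correctly_classified(const int a_out[N_OUT],
                             const int stored[][N_OUT],
                             int n_objects, int correct)
    {
        for (int o = 0; o < n_objects; o++) {
            int ov = overlap(a_out, stored[o]);
            if (o == correct ? ov < THRESH : ov >= THRESH)
                return 0;
        }
        return 1;
    }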

For the network convergence experiment (Figures 4, 5), each object consists of 10 sensory features chosen from a library of 5 to 30 possible features.
The number of sensations during testing is 20.
For the capacity experiment, each object consists of 10 sensory features chosen from a large library of 5,000 possible features.
The number of sensations during testing is 3.

Finally, we make some simplifying assumptions that greatly speed up simulation time for larger networks.
Instead of explicitly initializing a complete set of synapses across every segment and every cell, we greedily create segments on a random cell and initialize potential synapses on that segment by sampling from currently active cells.
This happens only when there is no match to any existing segment.

For the noise robustness experiment (Figure 6) we added random noise to the sensory input and the location input.
For each input, we randomly flip a fraction of the active input bits to inactive, and flip the corresponding number of inactive input bits to active.
This procedure randomizes inputs while maintaining constant input sparsity.
The noise level denotes the fraction of active input bits that are changed for each input.
We varied the amount of noise between 0 and 0.7.
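A C sketch of this noise procedure (our illustration; rand() for brevity): a fraction of the w active bits is switched off and the same number of previously inactive bits switched on, so the overall sparsity is unchanged:

    #include <stdlib.h>

    void add_noise(int *bits, int n, int w, float noise)
    {
        int flips = (int)(noise * w + 0.5f);   /* bits to change */
        int i, k;
        for (k = 0; k < flips; ) {             /* clear some active bits */
            i = rand() % n;
            if (bits[i] == 1) { bits[i] = -1; k++; }  /* mark as cleared */
        }
        for (k = 0; k < flips; ) {             /* set some inactive bits */
            i = rand() % n;
            if (bits[i] == 0) { bits[i] = 1; k++; }
        }
        for (i = 0; i < n; i++)                /* finalize cleared marks */
            if (bits[i] == -1) bits[i] = 0;
    }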

604 :yamaguti:2018/09/18(火) 23:04:10.79 ID:F/b4+koTS ?2BP(3)
We constructed an ideal observer model to estimate the theoretical upper limit for model performance (Figure 4C, Supplementary Figure 9).
During learning, the ideal observer model memorizes a list of (feature, location) pairs for each object.
During inference, the ideal observer model stores the sequence of observed (feature, location) pairs and calculates the overlap between all the observed pairs and the memorized list of pairs for each object.
The predicted object is the object that has the most overlap with all the observed sensations.
To compare the ideal observer with a multi-column network with N columns, we provide it with N randomly chosen observations per sensation.
Performance of the ideal observer model represents the best one can do given all the sensations up to the current time.
We also used the same framework to create a model that only uses sensory features, but no location signals (used in Figure 4C).

Author contributions
JH conceived of the overall theory and the detailed mapping to neuroscience, helped design the simulations, and wrote most of the paper.
SA and YC designed and implemented the simulations and created the mathematical formulation of the algorithm.

Conflict of interest statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
JH, SA, and YC were employed by Numenta Inc.
Numenta has some patents relevant to the work.
Numenta has stated that use of its intellectual property, including all the ideas contained in this work, is free for non-commercial research purposes.
In addition Numenta has released all pertinent source code as open source under a GPL V3 license (which includes a patent peace provision).

Acknowledgments

605 :yamaguti:2018/09/18(火) 23:06:29.77 ID:F/b4+koTS ?2BP(3)
We thank the reviewers for their detailed comments, which have helped to improve the paper significantly.
We thank Jeff Gavornik for his thoughtful comments and suggestions.
We also thank Marcus Lewis, Nathanael Romano, and numerous other collaborators at Numenta over the years for many discussions.

Footnotes
Funding.
Numenta is a privately held company.
Its funding sources are independent investors and venture capitalists.

Supplementary material
The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fncir.2017.00081/full#supplementary-material
References

606 :yamaguti:2018/09/20(木) 00:00:59.64 ID:94opR8z06 ?2BP(3)
* Ahmad S., Hawkins J. (2016). How do neurons operate on sparse distributed representations? A mathematical theory of sparsity, neurons and active dendrites. arXiv:1601.00720 [q-bio.NC].
* Ahveninen J., Jääskeläinen I. P., Raij T., Bonmassar G., Devore S., Hämäläinen M., et al. (2006). Task-modulated “what” and “where” pathways in human auditory cortex. Proc. Natl. Acad. Sci. U.S.A. 103, 14608–14613. 10.1073/pnas.0510480103
* Andersen R. A., Snyder L. H., Li C. S., Stricanne B. (1993). Coordinate transformations in the representation of spatial information. Curr. Opin. Neurobiol. 3, 171–176. 10.1016/0959-4388(93)90206-E
* Bastos A. M., Usrey W. M., Adams R. A., Mangun G. R., Fries P., Friston K. J. (2012). Canonical microcircuits for predictive coding. Neuron 76, 695–711. 10.1016/j.neuron.2012.10.038
* Billaudelle S., Ahmad S. (2015). Porting HTM models to the Heidelberg neuromorphic computing platform. arXiv:1505.02142 [q-bio.NC].
* Binzegger T., Douglas R. J., Martin K. A. C. (2004). A quantitative map of the circuit of cat primary visual cortex. J. Neurosci. 24, 8441–8453. 10.1523/JNEUROSCI.1400-04.2004
* Bolognini N., Maravita A. (2007). Proprioceptive alignment of visual and somatosensory maps in the posterior parietal cortex. Curr. Biol. 17, 1890–1895. 10.1016/j.cub.2007.09.057
* Bremmer F. (2000). Eye position effects in macaque area V4. Neuroreport 11, 1277–1283. 10.1097/00001756-200004270-00027
* Brotchie P. R., Andersen R. A., Snyder L. H., Goodman S. J. (1995). Head position signals used by parietal neurons to encode locations of visual stimuli. Nature 375, 232–235. 10.1038/375232a0

607 :yamaguti:2018/09/20(木) 00:02:06.44 ID:94opR8z06 ?2BP(3)
* Brotchie P. R., Lee M. B., Chen D. Y., Lourensz M., Jackson G., Bradley W. G. (2003).
Head position modulates activity in the human parietal eye fields.
Neuroimage 18, 178–184. 10.1006/nimg.2002.1294
* Bureau I., Shepherd G. M. G., Svoboda K. (2004).
Precise development of functional and anatomical columns in the neocortex.
Neuron 42, 789–801. 10.1016/j.neuron.2004.05.002
* Buxhoeveden D. P. (2002).
The minicolumn hypothesis in neuroscience.
Brain 125, 935–951. 10.1093/brain/awf110
* Chapin J. K. (1986).
Laminar differences in sizes, shapes, and response profiles of cutaneous receptive fields in the rat SI cortex.
Exp. Brain Res. 62, 549–559. 10.1007/BF00236033
* Chklovskii D. B., Mel B. W., Svoboda K. (2004).
Cortical rewiring and information storage.
Nature 431, 782–788. 10.1038/nature03012
* Constantinople C. M., Bruno R. M. (2013).
Deep cortical layers are activated directly by thalamus.
Science 340, 1591–1594. 10.1126/science.1236425
* Craft E., Schutze H., Niebur E., von der Heydt R. (2007).
A neural model of figure-ground organization.
J. Neurophysiol. 97, 4310–4326. 10.1152/jn.00203.2007
* DeSouza J. F., Dukelow S. P., Vilis T. (2002).
Eye position signals modulate early dorsal and ventral visual areas.
Cereb. Cortex 12, 991–997. 10.1093/cercor/12.9.991
* Douglas R. J., Martin K. A. (1991).
A functional microcircuit for cat visual cortex.
J. Physiol. 440, 735–769. 10.1113/jphysiol.1991.sp018733
* Douglas R. J., Martin K. A. (2004).
Neuronal circuits of the neocortex.
Annu. Rev. Neurosci. 27, 419–451. 10.1146/annurev.neuro.27.070203.144152

608 :yamaguti:2018/09/20(木) 00:05:57.40 ID:94opR8z06 ?2BP(3)
* Duhamel J., Colby C. L., Goldberg M. E. (1992).
The updating of the representation of visual space in parietal cortex by intended eye movements.
Science 255, 90–92. 10.1126/science.1553535
* Feldmeyer D., Brecht M., Helmchen F., Petersen C. C. H., Poulet J. F. A., Staiger J. F., et al. (2013).
Barrel cortex function.
Prog. Neurobiol. 103, 3–27. 10.1016/j.pneurobio.2012.11.002
* Feldmeyer D., Lübke J., Silver R. A., Sakmann B. (2002).
Synaptic connections between layer 4 spiny neurone-layer 2/3 pyramidal cell pairs in juvenile rat barrel cortex: physiology and anatomy of interlaminar signalling within a cortical column.
J. Physiol. 538, 803–822. 10.1113/jphysiol.2001.012959
* Gavornik J. P., Bear M. F. (2014).
Learned spatiotemporal sequence recognition and prediction in primary visual cortex.
Nat. Neurosci. 17, 732–737. 10.1038/nn.3683
* Gilbert C. D. (1977).
Laminar differences in receptive field properties of cells in cat primary visual cortex.
J. Physiol. 268, 391–421. 10.1113/jphysiol.1977.sp011863
* Goodale M. A., Milner A. D. (1992).
Separate visual pathways for perception and action.
Trends Neurosci. 15, 20–25. 10.1016/0166-2236(92)90344-8
* Graziano M. S., Gross C. G. (1998).
Spatial maps for the control of movement.
Curr. Opin. Neurobiol. 8, 195–201. 10.1016/S0959-4388(98)80140-2
* Graziano M. S., Hu X. T., Gross C. G. (1997).
Visuospatial properties of ventral premotor cortex.
J. Neurophysiol. 77, 2268–2292.
* Guillery R. W., Sherman S. M. (2011).
Branched thalamic afferents: what are the messages that they relay to the cortex?
Brain Res. Rev. 66, 205–219. 10.1016/j.brainresrev.2010.08.001

609 :yamaguti:2018/09/20(木) 00:07:28.88 ID:94opR8z06 ?2BP(3)
* Guy J., Staiger J. F. (2017).
The functioning of a cortex without layers.
Front. Neuroanat. 11:54. 10.3389/fnana.2017.00054
* Haeusler S., Maass W. (2007).
A statistical analysis of information-processing properties of lamina-specific cortical microcircuit models.
Cereb. Cortex 17, 149–162. 10.1093/cercor/bhj132
* Hafting T., Fyhn M., Molden S., Moser M.-B., Moser E. I. (2005).
Microstructure of a spatial map in the entorhinal cortex.
Nature 436, 801–806. 10.1038/nature03721
* Hawkins J., Ahmad S. (2016).
Why neurons have thousands of synapses, a theory of sequence memory in neocortex.
Front. Neural Circuits 10:23. 10.3389/fncir.2016.00023
* Helmstaedter M., Sakmann B., Feldmeyer D. (2009).
Neuronal correlates of local, lateral, and translaminar inhibition with reference to cortical columns.
Cereb. Cortex 19, 926–937. 10.1093/cercor/bhn141
* Hill S., Tononi G. (2004).
Modeling sleep and wakefulness in the thalamocortical system.
J. Neurophysiol. 93, 1671–1698. 10.1152/jn.00915.2004
* Horton J. C., Adams D. L. (2005).
The cortical column: a structure without a function.
Philos. Trans. R. Soc. Lond. B Biol. Sci. 360, 837–862. 10.1098/rstb.2005.1623
* Hubel D., Wiesel T. N. (1962).
Receptive fields, binocular interaction and functional architecture in the cat's visual cortex.
J. Physiol. 160, 106–154. 10.1113/jphysiol.1962.sp006837
* Hunt J. J., Bosking W. H., Goodhill G. J. (2011).
Statistical structure of lateral connections in the primary visual cortex.
Neural Syst. Circuits 1:3. 10.1186/2042-1001-1-3

610 :yamaguti:2018/09/20(木) 00:08:48.18 ID:94opR8z06 ?2BP(3)
* Iyer R., Mihalas S. (2017).
Cortical circuits implement optimal context integration.
bioRxiv. 10.1101/158360
* Jones E. G. (2000).
Microcolumns in the cerebral cortex.
Proc. Natl. Acad. Sci. U.S.A. 97, 5019–5021. 10.1073/pnas.97.10.5019
* Kakei S., Hoffman D. S., Strick P. L. (2003).
Sensorimotor transformations in cortical motor areas.
Neurosci. Res. 46, 1–10. 10.1016/S0168-0102(03)00031-2
* Kim E. J., Juavinett A. L., Kyubwa E. M., Jacobs M. W., Callaway E. M. (2015).
Three types of cortical layer 5 neurons that differ in brain-wide connectivity and function.
Neuron 88, 1253–1267. 10.1016/j.neuron.2015.11.002
* Kim J., Matney C. J., Blankenship A., Hestrin S., Brown S. P. (2014).
Layer 6 corticothalamic neurons activate a cortical output layer, layer 5a.
J. Neurosci. 34, 9656–9664. 10.1523/JNEUROSCI.1325-14.2014
* Kropff E., Carmichael J. E., Moser M.-B., Moser E. I. (2015).
Speed cells in the medial entorhinal cortex.
Nature 523, 419–424. 10.1038/nature14622
* LeCun Y., Bengio Y., Hinton G. (2015).
Deep learning.
Nature 521, 436–444. 10.1038/nature14539
* Lee S., Carvell G. E., Simons D. J. (2008).
Motor modulation of afferent somatosensory circuits.
Nat. Neurosci. 11, 1430–1438. 10.1038/nn.2227
* Lefort S., Tomm C., Floyd Sarria J.-C., Petersen C. C. H. (2009).
The excitatory neuronal network of the C2 barrel column in mouse primary somatosensory cortex.
Neuron 61, 301–316. 10.1016/j.neuron.2008.12.020

611 :yamaguti:2018/09/20(木) 00:10:34.84 ID:94opR8z06 ?2BP(3)
* Lemmon V., Pearlman A. L. (1981).
Does laminar position determine the receptive field properties of cortical neurons? A study of corticotectal cells in area 17 of the normal mouse and the reeler mutant.
J. Neurosci. 1, 83–93.
* Li N., DiCarlo J. J. (2008).
Unsupervised natural experience rapidly alters invariant object representation in visual cortex.
Science 321, 1502–1507. 10.1126/science.1160028
* Lohmann H., Rörig B. (1994).
Long-range horizontal connections between supragranular pyramidal cells in the extrastriate visual cortex of the rat.
J. Comp. Neurol. 344, 543–558. 10.1002/cne.903440405
* Losonczy A., Makara J. K., Magee J. C. (2008).
Compartmentalized dendritic plasticity and input feature storage in neurons.
Nature 452, 436–441. 10.1038/nature06725
* Lubke J., Egger V., Sakmann B., Feldmeyer D. (2000).
Columnar organization of dendrites and axons of single and synaptically coupled excitatory spiny neurons in layer 4 of the rat barrel cortex.
J. Neurosci. 20, 5300–5311.
* Luhmann H. J., Singer W., Martínez-Millán L. (1990).
Horizontal interactions in cat striate cortex: I. Anatomical substrate and postnatal development.
Eur. J. Neurosci. 2, 344–357. 10.1111/j.1460-9568.1990.tb00426.x
* Maass W. (1997).
Networks of spiking neurons: the third generation of neural network models.
Neural Netw. 10, 1659–1671. 10.1016/S0893-6080(97)00011-7
* Mangini N. J., Pearlman A. L. (1980).
Laminar distribution of receptive field properties in the primary visual cortex of the mouse.
J. Comp. Neurol. 193, 203–222. 10.1002/cne.901930114

612 :yamaguti:2018/09/20(木) 00:12:24.26 ID:94opR8z06 ?2BP(3)
* Markov N. T., Ercsey-Ravasz M., Van Essen D. C., Knoblauch K., Toroczkai Z., Kennedy H. (2013).
Cortical high-density counterstream architectures.
Science 342:1238406. 10.1126/science.1238406
* Markram H., Muller E., Ramaswamy S., Reimann M. W., Abdellah M., Sanchez C. A., et al. (2015).
Reconstruction and simulation of neocortical microcircuitry.
Cell 163, 456–492. 10.1016/j.cell.2015.09.029
* Martin K. A. C., Schröder S. (2013).
Functional heterogeneity in neighboring neurons of cat primary visual cortex in response to both artificial and natural stimuli.
J. Neurosci. 33, 7325–7344. 10.1523/JNEUROSCI.4071-12.2013
* McGuire B. A., Hornung J. P., Gilbert C. D., Wiesel T. N. (1984).
Patterns of synaptic input to layer 4 of cat striate cortex.
J. Neurosci. 4, 3021–3033.
* Meyer H. S., Schwarz D., Wimmer V. C., Schmitt A. C., Kerr J. N. D., Sakmann B., et al. (2011).
Inhibitory interneurons in a cortical column form hot zones of inhibition in layers 2 and 5A.
Proc. Natl. Acad. Sci. U.S.A. 108, 16807–16812. 10.1073/pnas.1113648108
* Moser E. I., Kropff E., Moser M.-B. (2008).
Place cells, grid cells, and the brain's spatial representation system.
Annu. Rev. Neurosci. 31, 69–89. 10.1146/annurev.neuro.31.061307.090723
* Moser E. I., Roudi Y., Witter M. P., Kentros C., Bonhoeffer T., Moser M.-B. (2014).
Grid cells and cortical representation.
Nat. Rev. Neurosci. 15, 466–481. 10.1038/nrn3766
* Moser M.-B., Rowland D. C., Moser E. I. (2015).
Place cells, grid cells, and memory.
Cold Spring Harb. Perspect. Biol. 7:a021808. 10.1101/cshperspect.a021808

613 :yamaguti:2018/09/20(木) 00:15:44.06 ID:94opR8z06 ?2BP(3)
* Mountcastle V. (1978).
An organizing principle for cerebral function: the unit module and the distributed system, in The Mindful Brain, eds Edelman G., Mountcastle V.
(Cambridge, MA: MIT Press), 7–50.
* Mountcastle V. B. (1997).
The columnar organization of the neocortex.
Brain 120, 701–722. 10.1093/brain/120.4.701
* Movshon J. A., Thompson I. D., Tolhurst D. J. (1978).
Receptive field organization of complex cells in the cat's striate cortex.
J. Physiol. 283, 79–99. 10.1113/jphysiol.1978.sp012489
* Nakamura K., Colby C. L. (2002).
Updating of the visual representation in monkey striate and extrastriate cortex during saccades.
Proc. Natl. Acad. Sci. U.S.A. 99, 4026–4031. 10.1073/pnas.052379899
* Pouget A., Snyder L. H. (2000).
Computational approaches to sensorimotor transformations.
Nat. Neurosci. 3, 1192–1198. 10.1038/81469
* Raizada R. D. S., Grossberg S. (2003).
Towards a theory of the laminar architecture of cerebral cortex: computational clues from the visual system.
Cereb. Cortex 13, 100–113. 10.1093/cercor/13.1.100
* Ramaswamy S., Markram H. (2015).
Anatomy and physiology of the thick-tufted layer 5 pyramidal neuron.
Front. Cell. Neurosci. 9:233. 10.3389/fncel.2015.00233
* Reimann M. W., Anastassiou C. A., Perin R., Hill S. L., Markram H., Koch C. (2013).
A biophysically detailed model of neocortical local field potentials predicts the critical role of active membrane currents.
Neuron 79, 375–390. 10.1016/j.neuron.2013.05.023

614 :yamaguti:2018/09/20(木) 00:16:50.75 ID:94opR8z06 ?2BP(3)
* Reyes-Puerta V., Kim S., Sun J.-J., Imbrosci B., Kilb W., Luhmann H. J. (2015a).
High stimulus-related information in barrel cortex inhibitory interneurons.
PLoS Comput. Biol. 11:e1004121. 10.1371/journal.pcbi.1004121
* Reyes-Puerta V., Sun J.-J., Kim S., Kilb W., Luhmann H. J. (2015b).
Laminar and columnar structure of sensory-evoked multineuronal spike sequences in adult rat barrel cortex in vivo.
Cereb. Cortex 25, 2001–2021. 10.1093/cercor/bhu007
* Rizzolatti G., Cattaneo L., Fabbri-Destro M., Rozzi S. (2014).
Cortical mechanisms underlying the organization of goal-directed actions and mirror neuron-based action understanding.
Physiol. Rev. 94, 655–706. 10.1152/physrev.00009.2013
* Rockland K. S. (2010).
Five points on columns.
Front. Neuroanat. 4:22. 10.3389/fnana.2010.00022
* Russo G. S., Bruce C. J. (1994).
Frontal eye field activity preceding aurally guided saccades.
J. Neurophysiol. 71, 1250–1253.
* Rust N. C., DiCarlo J. J. (2010).
Selectivity and tolerance (“invariance”) both increase as visual information propagates from cortical area V4 to IT.
J. Neurosci. 30, 12978–12995. 10.1523/JNEUROSCI.0179-10.2010
* Sarid L., Bruno R., Sakmann B., Segev I., Feldmeyer D. (2007).
Modeling a layer 4-to-layer 2/3 module of a single column in rat neocortex: interweaving in vitro and in vivo experimental observations.
Proc. Natl. Acad. Sci. U.S.A. 104, 16353–16358. 10.1073/pnas.0707853104
* Schnepel P., Kumar A., Zohar M., Aertsen A., Boucsein C. (2015).
Physiology and impact of horizontal connections in rat neocortex.
Cereb. Cortex 25, 3818–3835. 10.1093/cercor/bhu265

615 :yamaguti:2018/09/20(木) 00:18:30.52 ID:94opR8z06 ?2BP(3)
* Sherman S. M., Guillery R. W. (2011).
Distinct functions for direct and transthalamic corticocortical connections.
J. Neurophysiol. 106, 1068–1077. 10.1152/jn.00429.2011
* Shipp S. (2007).
Structure and function of the cerebral cortex.
Curr. Biol. 17, R443–R449. 10.1016/j.cub.2007.03.044
* Spruston N. (2008).
Pyramidal neurons: dendritic structure and synaptic integration.
Nat. Rev. Neurosci. 9, 206–221. 10.1038/nrn2286
* St. John-Saaltink E., Kok P., Lau H. C., de Lange F. P. (2016).
Serial dependence in perceptual decisions is reflected in activity patterns in primary visual cortex.
J. Neurosci. 36, 6186–6192. 10.1523/JNEUROSCI.4390-15.2016
* Stettler D. D., Das A., Bennett J., Gilbert C. D. (2002).
Lateral connectivity and contextual interactions in macaque primary visual cortex.
Neuron 36, 739–750. 10.1016/S0896-6273(02)01029-2
* Stuart G. J., Häusser M. (2001).
Dendritic coincidence detection of EPSPs and action potentials.
Nat. Neurosci. 4, 63–71. 10.1038/82910
* Thomson A. M. (2010).
Neocortical layer 6, a review.
Front. Neuroanat. 4:13. 10.3389/fnana.2010.00013
* Thomson A. M., Bannister A. P. (2003).
Interlaminar connections in the neocortex.
Cereb. Cortex 13, 5–14. 10.1093/cercor/13.1.5
* Thomson A. M., Lamy C. (2007).
Functional maps of neocortical local circuitry.
Front. Neurosci. 1, 19–42. 10.3389/neuro.01.1.1.002.2007
* Traub R. D., Contreras D., Cunningham M. O., Murray H., LeBeau F. E. N., Roopun A., et al. (2004).
Single-column thalamocortical network model exhibiting gamma oscillations, sleep spindles, and epileptogenic bursts.
J. Neurophysiol. 93, 2194–2232. 10.1152/jn.00983.2004

616 :yamaguti:2018/09/20(木) 00:19:11.66 ID:94opR8z06 ?2BP(3)
* Trotter Y., Celebrini S. (1999).
Gaze direction controls response gain in primary visual-cortex neurons.
Nature 398, 239–242. 10.1038/18444
* Ungerleider L. G., Haxby J. V. (1994).
“What” and “where” in the human brain.
Curr. Opin. Neurobiol. 4, 157–165. 10.1016/0959-4388(94)90066-3
* Viaene A. N., Petrof I., Sherman S. M. (2011a).
Synaptic properties of thalamic input to layers 2/3 and 4 of primary somatosensory and auditory cortices.
J. Neurophysiol. 105, 279–292. 10.1152/jn.00747.2010
* Viaene A. N., Petrof I., Sherman S. M. (2011b).
Synaptic properties of thalamic input to the subgranular layers of primary somatosensory and auditory cortices in the mouse.
J. Neurosci. 31, 12738–12747. 10.1523/JNEUROSCI.1565-11.2011
* Vinje W. E., Gallant J. L. (2002).
Natural stimulation of the nonclassical receptive field increases information transmission efficiency in V1.
J. Neurosci. 22, 2904–2915.
* Werner-Reiss U., Kelly K. A., Trause A. S., Underhill A. M., Groh J. M. (2003).
Eye position affects activity in primary auditory cortex of primates.
Curr. Biol. 13, 554–562. 10.1016/S0960-9822(03)00168-4
* Yen S.-C., Baker J., Gray C. M. (2006).
Heterogeneity in the responses of adjacent neurons to natural stimuli in cat striate cortex.
J. Neurophysiol. 97, 1326–1341. 10.1152/jn.00747.2006
* Zhou H., Friedman H. S., von der Heydt R. (2000).
Coding of border ownership in monkey visual cortex.
J. Neurosci. 20, 6594–6611.
* Zipser D., Andersen R. A. (1988).
A back-propagation programmed network that simulates response properties of a subset of posterior parietal neurons.
Nature 331, 679–684. 10.1038/331679a0

617 :yamaguti:2018/09/25(火) 17:53:29.33 ID:b91A4MXGL ?2BP(3)
>>575-616
Figure 1

(A) Our network model contains one or more laterally connected cortical columns (three shown).
Each column receives feedforward sensory input from a different sensor array
[e.g., different fingers or adjacent areas of the retina (not shown)].
The input layer combines sensory input with a modulatory location input to form sparse representations that correspond to features at specific locations on the object.
The output layer receives feedforward inputs from the input layer and converges to a stable pattern representing the object (e.g., a coffee cup).
Convergence in the second layer is achieved via two means.
One is by integration over time as the sensor moves relative to the object, and the other is via modulatory lateral connections between columns that are simultaneously sensing different locations on the same object (blue arrows in upper layer).
Feedback from the output layer to the input layer allows the input layer to predict what feature will be present after the next movement of the sensor.
(B) Pyramidal neurons have three synaptic integration zones, proximal (green), basal (blue), and apical (purple).
Although individual synaptic inputs onto basal and apical dendrites have a small impact on the soma, co-activation of a small number of synapses on a dendritic segment can trigger a dendritic spike (top right).
The HTM neuron model incorporates active dendrites and multiple synaptic integration zones (bottom).
Patterns recognized on proximal dendrites generate action potentials.
Patterns recognized on the basal and apical dendrites depolarize the soma without generating an action potential.
Depolarization is a predictive state of the neuron.
Our network model relies on these properties and our simulations use HTM neurons.
A detailed walkthrough of the algorithm can be found in the Supplementary Video.
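
The distinction between firing and mere depolarization can be captured in a few lines; the following toy Python sketch mirrors the three integration zones described in (B), with invented thresholds that are not the paper's parameters:

def htm_neuron_state(proximal_overlap, basal_overlap, apical_overlap,
                     proximal_threshold=10, segment_threshold=13):
    # Proximal pattern match: generate an action potential.
    if proximal_overlap >= proximal_threshold:
        return "active"
    # Basal/apical pattern match: depolarize without firing (prediction).
    if basal_overlap >= segment_threshold or apical_overlap >= segment_threshold:
        return "predictive"
    return "inactive"

print(htm_neuron_state(12, 0, 0))   # active
print(htm_neuron_state(0, 15, 0))   # predictive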

618 :yamaguti:2018/09/25(火) 17:54:32.55 ID:b91A4MXGL ?2BP(3)
>>575-616
Figure 2

Cellular activations in the input and output layers of a single column during a sequence of touches on an object.
(A) Two objects (cube and wedge).
For each object, three feature-location pairs are shown (f1 and f3 are common to both the cube and wedge).
The output layer representations associated with each object, and the sensory representations for each feature are shown.
(B) Cellular activations in both layers caused by a sequence of three touches on the cube (in time order from top to bottom).
The first touch (at f1) results in a set of active cells in the input layer (black dots in input layer) corresponding to that feature-location pair.
Cells in the output layer receive this representation through their feed-forward connections (black arrow).
Since the input is ambiguous, the output layer forms a representation that is the union of both the cube and the wedge (black dots in output layer).
Feedback from the output layer to the input layer (red arrow) causes all feature-location pairs associated with both potential objects to become predicted (red dots).
The second touch (at f2) results in a new set of active cells in the input layer.
Since f2 is not shared with the wedge, the representation in the output layer is reduced to only the cube.
The set of predicted cells in the input layer is also reduced to the feature-location pairs of the cube.
The third touch (at f3) would be ambiguous on its own, however, due to the past sequence of touches and self-reinforcement in the output layer, the representation of the object in the output layer remains unique to the cube.
Note the number of cells shown is unrealistically small for illustration clarity.
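
The narrowing process in this caption amounts to a union followed by intersections; a toy Python sketch of a single column (feature and location names are invented for illustration):

objects = {
    "cube":  {("f1", "locA"), ("f2", "locB"), ("f3", "locC")},
    "wedge": {("f1", "locA"), ("f3", "locD")},
}

def candidates(pair):
    # All objects consistent with one (feature, location) sensation.
    return {name for name, pairs in objects.items() if pair in pairs}

active = candidates(("f1", "locA"))    # ambiguous: {'cube', 'wedge'}
active &= candidates(("f2", "locB"))   # second touch narrows to: {'cube'}
active &= candidates(("f3", "locC"))   # stays unique: {'cube'}
print(active)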

619 :yamaguti:2018/09/25(火) 17:55:49.47 ID:b91A4MXGL ?2BP(3)
>>575-616
Figure 3

(A) The output layer represents each object by a sparse pattern.
We tested the network on the first object.
(B) Activity in the output layer of a single column network as it touches the object.
The network converges after 11 sensations (red rectangle).
(C) Activity in the output layer of a three column network as it touches the object.
The network converges much faster, after four sensations (red rectangle).
In both (B,C) the representation in Column 1 is the same as the target object representation after convergence.


Figure 4

(A) Mean number of sensations needed to unambiguously recognize an object with a single column network as the set of learned objects increases.
We train models on varying numbers of objects, from 1 to 100 and plot the average number of sensations required to unambiguously recognize a single object.
The different curves show how convergence varies with the total number of unique features from which objects are constructed.
In all cases the network eventually recognizes the object.
Recognition requires fewer sensations when the set of features is greater.
(B) Mean number of observations needed to unambiguously recognize an object with multi-column networks as the set of columns increases.
We train each network with 100 objects and plot the average number of sensations required to unambiguously recognize an object.
The required number of sensations rapidly decreases as the number of columns increases, eventually reaching one.
(C) Fraction of objects that can be unambiguously recognized as a function of number of sensations for an ideal observer model with location (blue), without location (orange) and our one-column sensorimotor network (green).

620 :yamaguti:2018/09/25(火) 17:56:59.97 ID:b91A4MXGL ?2BP(3)
>>576
Figure 5

Recognition accuracy is plotted as a function of the number of learned objects.
(A) Network capacity relative to number of mini-columns in the input layer.
The number of output cells is kept at 4,096 with 40 cells active at any time.
(B) Network capacity relative to number of cells in the output layer.
The number of active output cells is kept at 40.
The number of mini-columns in the input layer is 150.
(C) Network capacity for one, two, and three cortical columns (CCs).
The number of mini-columns in the input layer is 150, and the number of output cells is 4,096.


>>577
Figure 6

Robustness of a single column network to noise.
(A) Recognition accuracy is plotted as a function of the amount of noise in the sensory input (blue) and in the location input (yellow).
(B) Recognition accuracy as a function of the number of sensations.
Colored lines correspond to noise levels in the location input.

621 :yamaguti:2018/09/25(火) 17:58:06.40 ID:b91A4MXGL ?2BP(3)
>>579
Figure 7

Mapping of sensorimotor inference network onto experimentally observed cortical connections.
Arrows represent documented pathways.
(A) First instance of network; L4 is input layer, L2/3 is output layer.
Green arrows are feedforward pathway, from thalamo-cortical (TC) relay cells, to L4, to L2/3 cortico-cortical (CC) output cells.
Cells in L2/3 also project back to L4 and to adjacent columns (blue arrows); these projections depolarize specific sets of cells that act as predictions (see text).
Red arrow is location signal originating in L6a and terminating on basal distal dendrites of L4 cells.
(B) Possible second instance of network; L6a is input layer, L5 is output layer.
Both instances of the network receive feedforward input from the same TC axons, thus the two networks run in parallel (Constantinople and Bruno, 2013; Markov et al., 2013).
The origin and derivation of the location signal (LOC) is unknown but likely involves local processing as well as input from other regions (see text and Discussion).
The output of the upper network makes direct cortical-cortical (CC) connections, whereas the output of the lower network projects to thalamic relay cells before projecting to the next region.

622 :YAMAGUTIseisei:2018/10/14(日) 12:45:09.59 ID:BFWbyhVTH ?2BP(3)
>>596
His hypothesis remains controversial, partly because we haven't been able to identify what that common columnar function is, and partly because it has been hard to imagine what single complex function is applicable to all sensory and cognitive processes.

623 :YAMAGUTIseisei:2019/01/14(月) 00:33:12.87 ID:tjARBhyFs ?2BP(3)
http://arxiv.org/pdf/1505.02142/
arXiv:1505.02142v2 [q-bio.NC] 9 Feb 2016


Porting HTM Models to the Heidelberg Neuromorphic Computing Platform


Sebastian Billaudelle 1,*,
Subutai Ahmad 2,†

1 Kirchhoff-Institute for Physics, Heidelberg, Germany
2 Numenta, Inc., Redwood City, CA

* email: sebastian.billaudelleATkip.uni-heidelberg.d
† email: sahmadATnumenta


ABSTRACT
Hierarchical Temporal Memory (HTM) is a computational theory of machine intelligence based on a detailed study of the neocortex.
The Heidelberg Neuromorphic Computing Platform, developed as part of the Human Brain Project (HBP), is a mixed-signal (analog and digital) large-scale platform for modeling networks of spiking neurons.
In this paper we present the first effort in porting HTM networks to this platform.
We describe a framework for simulating key HTM operations using spiking network models.
We then describe specific spatial pooling and temporal memory implementations, as well as simulations demonstrating that the fundamental properties are maintained.
We discuss issues in implementing the full set of plasticity rules using Spike-Timing Dependent Plasticity (STDP), and present rough place-and-route calculations.
Although further work is required, our initial studies indicate that it should be possible to run large-scale HTM networks (including plasticity rules) efficiently on the Heidelberg platform.
More generally the exercise of porting high level HTM algorithms to biophysical neuron models promises to be a fruitful area of investigation for future studies.

624 :YAMAGUTIseisei:2019/01/14(月) 00:36:18.25 ID:tjARBhyFs ?2BP(3)
1 INTRODUCTION

The mammalian brain, particularly that of humans, is able to process diverse sensory input, learn and recognize complex spatial and temporal patterns, and generate behaviour based on context and previous experiences.
While computers are efficient in carrying out numerical calculations, they fall short in solving cognitive tasks.
Studying the brain, and the neocortex in particular, is an important step toward developing new algorithms that close the gap between intelligent organisms and artificial systems.
Numenta is a company dedicated to developing such algorithms and at the same time investigating the principles of the neocortex.
Their Hierarchical Temporal Memory (HTM) models are designed to solve real-world problems based on neuroscience results and theories.

Efficiently simulating large-scale neural networks in software is still a challenge.
The more biophysical detail a model features, the more computational resources it requires.
Different techniques for speeding up the execution of such implementations exist, e.g., parallelizing the calculations.
Dedicated hardware platforms are also being developed.
Digital neuromorphic hardware like the SpiNNaker platform often features highly parallelized processing architectures and optimized signal routing [Furber et al., 2014].
Analog systems, on the other hand, directly emulate the neuron's behavior in electronic microcircuits.
The Hybrid Multi-Scale Facility (HMF) is a mixed-signal platform developed in the scope of the BrainScaleS Project (BSS) and the Human Brain Project (HBP).

625 :yamaguti:2019/01/14(月) 00:38:37.49 ID:tjARBhyFs ?2BP(3)
In this paper we present efforts in porting HTM networks to the HMF.
A framework for simulating HTMs based on spiking neural networks is introduced, as well as concrete network models for the HTM concepts of spatial pooling and temporal memory.
We compare their behavior to software implementations in order to verify basic properties of the HTM networks.
We discuss the overall applicability of these models on the target platform, the impact of synaptic plasticity, and connection routing considerations.

1.1 Hierarchical Temporal Memory

HTM represents a set of concepts and algorithms for machine intelligence based on neocortical principles [Hawkins et al., 2011].
It is designed to learn spatial as well as temporal patterns and generate predictions from previously seen sequences.
It features continuous learning and operates on streaming data.
An HTM network consists of one or multiple hierarchically arranged regions.
The latter contain neurons organized in columns.
The functional principle is captured in two algorithms which are laid out in detail in the original whitepaper [Hawkins et al., 2011].
The following paragraphs are intended as an introductory overview and introduce the properties relevant to this work.

The spatial pooler is designed to map a binary input vector to a set of columns.
By recognizing previously seen input data, it increases stability and reduces the system's susceptibility to noise.
Its behaviour can be characterized by the following properties (a toy code sketch follows the list):

1.
The columnar activity is sparse.
Typically, 40 out of 2,048 columns are active, corresponding to a sparsity of approximately 2 %.
The number of active columns is constant in each time step and does not depend on the input sparsity.

626 :yamaguti:2019/01/14(月) 00:41:11.93 ID:tjARBhyFs ?2BP(3)
2.
The spatial pooler activates the k columns which receive the most input.
In case of a tie between two columns, the active column is selected randomly, e.g. through structural advantages of certain cells compared to their neighbors.


3.
Stimuli with low pairwise overlap counts are mapped to sparse columnar representations with low pairwise overlap counts, while high overlaps are projected onto representations with high overlap.
Thus, similar input vectors lead to similar columnar activations, while disjoint stimuli activate distinct columns.

4.
A column must receive a minimum input (e.g. 15 bits) to become active.
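
A toy NumPy sketch of these four properties (the numbers are the typical values quoted above; there is no learning or boosting, so this is not a full spatial pooler):

import numpy as np

def spatial_pooler_step(input_bits, connections, k=40, min_overlap=15, rng=None):
    # connections: binary (n_columns, n_inputs) matrix; input_bits: binary vector.
    rng = rng or np.random.default_rng(0)
    overlap = (connections @ input_bits).astype(float)
    overlap += rng.uniform(0.0, 0.5, size=overlap.shape)  # random tie-breaking
    overlap[overlap < min_overlap] = -np.inf              # property 4
    winners = np.argsort(overlap)[-k:]                    # property 2: kWTA
    return winners[np.isfinite(overlap[winners])]         # property 1: <= k active

rng = np.random.default_rng(1)
connections = (rng.random((2048, 10000)) < 0.05).astype(int)
stimulus = (rng.random(10000) < 0.05).astype(int)
print(len(spatial_pooler_step(stimulus, connections)))    # at most 40 (~2 %)

Property 3 falls out of the deterministic overlap computation: similar inputs yield similar overlap vectors and hence similar winner sets.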

The temporal memory operates on single cells within columns and further processes the spatial pooler's output.
Temporal sequences are learned by the network and can be used for generating predictions and highlighting anomalies.
Individual cells receive stimuli from other neurons on their distal dendrites.
This lateral input provides a temporal context.
By modifying a cell's distal connectivity, temporal sequences can be learned and predicted.
The temporal memory's behavior can be summarized by the following (a toy sketch in code follows the list):

1.
Individual cells receive lateral input on their distal dendrites.
When a certain threshold is crossed, the cells enter a predictive (depolarized) state.

2.
When a column becomes active due to proximal input, it activates only those cells that are in predictive state.

627 :yamaguti:2019/01/14(月) 00:45:29.31 ID:tjARBhyFs ?2BP(3)
3.
When a column with no predicted cells becomes active due to proximal input, all cells in the column become active.
This phenomenon is referred to as columnar bursting.
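
A toy Python sketch of these three rules for one time step (the data layout and the cells-per-column count are assumptions, not the reference implementation):

def activate(active_columns, predictive_cells, cells_per_column=32):
    # predictive_cells: set of (column, cell_index) pairs depolarized by
    # distal input in the previous step (property 1).
    active = set()
    for col in active_columns:
        predicted = {(c, i) for (c, i) in predictive_cells if c == col}
        if predicted:
            active |= predicted          # property 2: only predicted cells fire
        else:                            # property 3: columnar bursting
            active |= {(col, i) for i in range(cells_per_column)}
    return active

print(len(activate({0, 1}, {(0, 5)})))   # 1 predicted cell + 32 bursting = 33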

1.2 Heidelberg Neuromorphic Computing Platform


Fig. 1.


A wafer containing 384 HICANN chips.
The undiced wafer undergoes a custom post-processing step where additional metal layers are applied to establish inter-reticle connectivity and power distribution.
(Photo courtesy of the Electronic Vision(s) group, Heidelberg.)


The HMF is a hybrid platform consisting of a traditional high-performance cluster and a neuromorphic system.
It is developed primarily at the Kirchhoff-Institute for Physics in Heidelberg and the TU Dresden, with funding from the BSS and HBP [HBP SP9 partners, 2014].
The platform's core is the wafer-scale integrated High Input Count Analog Neural Network (HICANN) chip shown in Figure 1.
Part of the chip's unique design is its mixed-signal architecture featuring analog neuron circuits and a digital communication infrastructure.
Due to the short intrinsic time constants of the hardware neurons, the system operates on an accelerated timescale with a speed-up factor of 10 × 10^4 compared to biological real-time.

HICANN features 512 neurons or dendritic membrane circuits.
Each circuit can be stimulated via 226 synapses on two synaptic inputs.
As a default, the latter are configured for excitatory and inhibitory stimuli, respectively.
However, they can be set up to represent e.g. two excitatory inputs with different synaptic time constants or reversal potentials.

628 :yamaguti:2019/01/14(月) 00:47:32.33 ID:tjARBhyFs ?2BP(3)
By connecting multiple dendritic membranes, larger neurons with up to 14 × 10^3 synapses can be formed.

A single wafer contains 384 chips with 200 × 10^3 neurons and 45 × 10^6 synapses.
Multiple wafers can be connected to form even larger networks.
The BSS's infrastructure consists of six wafers and is being extended to 20 wafers for the first HBP milestone.

1.3 Spiking Neuron Model

There exist different techniques of varying complexity for simulating networks of spiking neurons.
The reference implementation we use for HTM networks is based on first-generation, binary neurons with discrete time steps [Numenta, Inc].
Third-generation models, however, incorporate the concept of dynamic time and implement inter-neuron communication based on individual spikes.

Starting from the original Hodgkin-Huxley equations [Hodgkin and Huxley, 1952], multiple spiking neuron models have been developed that feature different levels of detail and abstraction.
The HICANN chip implements Adaptive Exponential Integrate-and-Fire (AdEx) neurons [Brette and Gerstner, 2005].
At its core, the AdEx model is a simple Leaky Integrate-and-Fire (LIF) model, extended with a detailed spiking behavior as well as spike-triggered and sub-threshold adaptation.
It was found to correctly predict approximately 96 % of the spike times of a Hodgkin-Huxley-type model neuron and about 90 % of the spikes recorded from a cortical neuron [Jolivet et al., 2008].
On the HMF, and thus also in the following simulations, the neurons are paired with conductance-based synapses, allowing for fine-grained control of the synaptic currents and the implementation of, e.g., shunting inhibition.
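
As a concrete illustration, AdEx neurons with conductance-based synapses can be set up through PyNN roughly as follows; this is a sketch using PyNN's defaults and the NEST backend mentioned later, not the paper's actual configuration:

import pyNN.nest as sim

sim.setup(timestep=0.1)  # ms
# EIF_cond_exp_isfa_ista is PyNN's AdEx model with conductance-based synapses.
neurons = sim.Population(10, sim.EIF_cond_exp_isfa_ista())
stimulus = sim.Population(1, sim.SpikeSourcePoisson(rate=100.0))
sim.Projection(stimulus, neurons, sim.AllToAllConnector(),
               sim.StaticSynapse(weight=0.01),  # uS
               receptor_type="excitatory")
neurons.record("spikes")
sim.run(1000.0)  # ms
print(neurons.get_data().segments[0].spiketrains)
sim.end()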



629 :yamaguti:2019/01/14(月) 00:49:18.37 ID:tjARBhyFs ?2BP(3)
2 SPIKING NETWORK MODELS

Implementing neural network models for a neuromorphic hardware platform or for dynamic software simulations requires an abstract network description defining the individual cell populations as well as the model's connectivity.
For this work, our primary focus was on developing mechanistic and functional implementations of the software reference models while staying within the topological and parameter restrictions imposed by the hardware platform.
A more detailed biophysical approach should begin with simulations of single HTM neurons and their dendritic properties before advancing to more complex systems, e.g. full networks.

In the following sections we describe spatial pooler and temporal memory models that incorporate basic HTM properties.
These models are able to reproduce the fundamental behaviour of existing software implementations.

The simulations were set up in Python using the PyNN library [Davison et al., 2008].
Besides supporting a wide range of software simulators, this high-level interface is also supported by the HMF platform [Billaudelle, 2014a].
NEST was used as a simulation backend [Gewaltig and Diesmann, 2007].
To enable multiple synaptic time constants per neuron, a custom implementation of the AdEx model was written.

2.1 Spatial Pooler


Fig. 2.


630 :yamaguti:2019/01/14(月) 00:52:26.38 ID:tjARBhyFs ?2BP(3)
Timing-based implementation of the spatial pooler.
Each column is represented by a single cell C and receives sparse input from the input vector 1 .
The columns become active when the number of connected active inputs crosses a threshold.
The rise time of the membrane voltage depends strongly on the number of coincident inputs: cells with more presynaptic activity will fire before those with fewer stimuli do.
The inhibitory pool I accumulates the columnar spikes 2 and in doing so acts as a counter.
After a certain number of columns have become active, the pool inhibits and shuts down all columns, preventing any further activity 3 .
To stabilize this kWTA model, all columns receive subsampled feed-forward inhibition 4 .
This effectively prolongs the decision period for high input activity.


At its core the spatial pooler resembles a k-Winners-Take-All (kWTA) network: k out of m columns are chosen to be active in each time step.
In fact, kWTA networks have often been mentioned as an approximation of circuits naturally occurring in the neocortex [Felch and Granger, 2008].
Continuous-time and VLSI implementations of such systems have been discussed in the literature [Erlanson and Abu-Mostafa, 1991, Tymoshchuk, 2012, Maass, 2000].
In the implementation below we describe a novel approach to maintaining stable sparsity levels even with a large number of inputs.

631 :yamaguti:2019/01/14(月) 00:53:19.87 ID:tjARBhyFs ?2BP(3)
The network developed for this purpose is presented in Figure 2.
It follows a purely time-based approach and is designed for LIF neurons.
It allows for very fast decision processes based on a single input event per source.
Each column is represented by a single cell which accumulates feed-forward input from the spike sources.
Here, the rise time of the membrane voltage decreases with the number of presynaptic events seen by the cell: cells receiving the most input will fire before the others.
An inhibitory pool consisting of a single cell collects the network' s activity.
Low membrane and high synaptic time constants lead to a reliable summation of events.
When a certain number of spikes have been collected – and thus the cell's threshold has been crossed – the pool strongly inhibits all cells of the network, suppressing subsequent spike events.

The model is extended by adding subtle feed-forward shunting inhibition.
The inhibitory conductance increases with the overall input activity ν_in.
With the reversal potential set to match the leakage potential, the conductance contributes to the leakage term

g_l' = g_l + g_inh(ν_in) .

This effectively slows down the neurons' responses and thus prolongs the decision period of the network.
With this inhibition, the resulting system is able to achieve stable sparsity levels with a large number of inputs, at the cost of slightly slower response times.

Tie situations between columns receiving the same number of presynaptic events are resolved by adding slight Gaussian jitter to the weights of the excitatory feed-forward connections.
This gives some columns structural advantages over others, resulting in a slightly faster response to the same stimulus.
By increasing the standard deviation σ_j of the jitter, the selection criterion can be blurred.
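
A NumPy sketch of this jitter (w_base and the 10 % jitter level are invented for illustration):

import numpy as np

rng = np.random.default_rng(0)
n_columns, n_inputs = 1000, 10000
w_base = 0.003                # nominal feed-forward weight (arbitrary units)
sigma_j = 0.1 * w_base        # jitter level; larger sigma blurs the selection
weights = w_base + rng.normal(0.0, sigma_j, size=(n_columns, n_inputs))

stimulus = (rng.random(n_inputs) < 0.02).astype(float)
drive = weights @ stimulus    # columns with equal input counts now differ
print(drive.argmax())         # slightly in total drive, breaking the tie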

632 :yamaguti:2019/01/14(月) 00:54:23.45 ID:tjARBhyFs ?2BP(3)
2.2 Temporal Memory

Similar to the spatial pooler, the temporal memory implementation was designed for fast reaction times and spike-timing-based response patterns.
A complete network consists of m identical columns with n HTM cells each.
Modelling these cells is a challenge in itself.
A multicompartmental neuron model would be the best fit.
While a neuromorphic hardware chip implementing such a model is planned and first steps in that direction have already been taken [Millner, 2012], the current system does not provide this feature.




Since HTM cells primarily depend on the active properties of a compartment, each cell can be modelled by a triple of individual LIF cells, as shown in Figure 3.


Fig. 3.


633 :yamaguti:2019/01/14(月) 00:55:12.38 ID:tjARBhyFs ?2BP(3)
Implementation of the temporal memory, not including plasticity.
Every HTM cell within a column is modeled with three individual LIF cells representing different compartments (distal dendrites D, soma S, and a lateral inhibition segment I, which is not biologically inspired).
Per column, there exist multiple such cell triples as well as one “head” cell P, which participates in the columnar competition and collects proximal input for the whole column.
Activity of this cell is forwarded to the individual soma cells of the column 1 .
Without a previous prediction, this results in all soma cells firing.
The distal compartment, however, sums over the input of the previous time step.
When a threshold is reached, the inhibitory compartment as well as the soma are depolarized 3 4 .
Together with proximal input 2 , the inhibitory partition fires and inhibits all other cells in the column 5 .


A column collects proximal input using a single cell.
In fact, this cell can be part of a spatial pooler network as presented in section 2.1.
When the column becomes active, this cell emits a spike and excites both the neurons representing the HTM cells' somata and the inhibitory cells.
The inhibitory projection, however, is not strong enough to activate the target compartment alone.
Instead, it only leads to a partial depolarization.
The soma neuron, in contrast, reaches the firing threshold for a single presynaptic event.
This suffices as a columnar bursting mechanism (i.e., temporal memory property 3): without predictive input, all soma compartments fire in response to the proximal stimulus.

634 :yamaguti:2019/01/14(月) 00:56:33.22 ID:tjARBhyFs ?2BP(3)
Distal input is processed by each cell individually in its dendritic segment compartment.
A cell's dendritic segment receives input from other cells' somata.
When its firing threshold is crossed, it partly depolarizes the inhibitory helper cell of the same triplet.
This synaptic projection is set up with a relatively long synaptic time constant and a reversal potential matching the threshold voltage.
This ensures that the predictive state is carried over to the next time step and prevents the cell from becoming active due to distal input alone.
On proximal input, the already depolarized helper cell fires before the somatic compartments.
The latter are then inhibited instantly, with the exception of the helper cell's own triplet's soma.
As described so far, this basic predictive mechanism fails when multiple cells are predicted, since the inhibitory compartments laterally inhibit every cell.
The solution is to also depolarize the somatic compartments of predicted cells.
In summary, this mechanism satisfies temporal memory properties 1 and 2.


3 RESULTS


The network models presented in the previous section were simulated in software to investigate their behavior.
In the following, the respective experiments and their results are presented.
Additionally, plasticity rules and topological requirements are discussed with respect to the HMF.

3.1 Network Simulations

635 :yamaguti:2019/01/14(月) 00:57:21.94 ID:tjARBhyFs ?2BP(3)
Fig. 5.

(Axes: number of columns vs. input events; legend: All Columns, Active Columns.)

Histogram showing the distribution of overlap scores individual columns receive.
Columns activated by the spatial pooler network are highlighted.
This confirms that only competitors with the highest input enter an active state.
Furthermore, tie situations between columns with the same overlap score are resolved correctly.


The spatial pooler was analyzed for a network spanning 1,000 columns and an input vector of size 10,000.
To speed up the simulation, the input connectivity was preprocessed in software by multiplying the stimulus vector by the connectivity matrix.
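
In NumPy terms, this preprocessing amounts to a single matrix-vector product (sizes from the text, random data for illustration):

import numpy as np

rng = np.random.default_rng(0)
connectivity = (rng.random((1000, 10000)) < 0.02).astype(int)  # column x input
stimulus = (rng.random(10000) < 0.03).astype(int)              # binary input
overlap = connectivity @ stimulus   # active inputs per column, drives the cells
print(overlap.shape, overlap.max())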

636 :yamaguti:2019/01/14(月) 00:58:41.82 ID:tjARBhyFs ?2BP(3)
A first experiment was designed to verify the basic kWTA functionality.
A random pattern was presented to the network.
The number of active inputs per column – the input overlap score – can be visualized in a histogram as shown in Figure 5.
By highlighting the columns activated by that specific stimulus, one can investigate the network's selection criteria.
Complying with the requirements for a spatial pooler, only the rightmost bars – representing columns with the highest input counts – are highlighted.

Furthermore, the model's capability to resolve ties between columns receiving the same input counts is demonstrated: the bar at the decision boundary was not selected as a whole; only a few of its columns were picked.
This verifies spatial pooler property 2.


Fig. 6.

(Axes: output sparsity [%] vs. input sparsity [%].)

The relative number of active columns is plotted against the input vector's sparsity.
After a certain level of input sparsity is reached, columns start to enter active states.
With higher presynaptic activity, columnar competition increases and the output sparsity reaches a plateau.
The curve's exact course, as well as the size of the plateau, can be manipulated through the neurons' parameters.
Error bars indicate the standard deviation across five trials.


In a second scenario, input vectors with varying sparsity were fed into the network, as shown in Figure 6.
The number of active columns stays approximately constant across a wide range of input sparsity.

637 :yamaguti:2019/01/14(月) 00:59:37.59 ID:tjARBhyFs ?2BP(3)


Fig. 7.

(Axes: output overlap vs. input overlap; legend: reference data, simulations 1–5.)

Output overlap as a function of the input vector's overlap score.
In each of the five simulation runs, the stimulus was gradually changed, starting from a random vector.
As required for a spatial pooler, two similar input stimuli get mapped to similar output patterns, while disjoint input vectors result in low overlap scores.
The simulations fully reproduce data from an existing software implementation, which is also shown in this figure.


Additionally, the plot shows that columns must receive a minimum amount of input to become active at all.
This verifies the underlying kWTA approach as well as spatial pooler properties 1 and 4.

638 :yamaguti:2019/01/14(月) 01:01:01.52 ID:tjARBhyFs ?2BP(3)
To verify the general functionality of a spatial pooler, expressed in property 3, a third experiment was set up.
Input data sets with a variable overlap were generated starting from an initial random binary vector.
For each stimulus, the overlap of the columnar activity with the initial dataset was calculated while sweeping the input's overlap.
The resulting relation of input and output overlap scores is shown in Figure 7.
Also included are the results of a similar experiment performed with a custom Python implementation of the spatial pooler directly following the original specification [Hawkins et al., 2011].
Multiple simulation runs all yielded results perfectly matching the reference data, thus verifying property 3.

The experiments have shown that the model presented in this section does fulfill the requirements for a spatial pooler and can be considered a solid kWTA implementation.
The specific results of course depend on the individual network size and configuration.
In this case, the network – most importantly the columnar neurons' time constants – was configured for a relatively short time step of T = 50 ms.

By choosing different parameter sets, the network can be tuned towards different operational scenarios, e.g. further increasing the model's stability.

The temporal memory was verified in a first sequence-prediction experiment.
A reference software implementation was trained with three disjoint sequences of length three.
Consecutive sequences were separated by a random input pattern.
The trained network's lateral connectivity was dumped and loaded into a simulation.
When presented with the same stimulus, the LIF-based implementation was able to correctly predict sequences, as shown in Figure 8.



639 :yamaguti:2019/01/14(月) 01:01:42.08 ID:tjARBhyFs ?2BP(3)
3.2 Learning Algorithms


Fig. 9.

(Axes: output overlap vs. input overlap; legend: reference data, simulations 1–5.)

Output overlap as a function of input overlap for a trained spatial pooler.
Results of five independent simulation runs are shown as well as reference data from a custom software implementation.


Implementing online learning mechanisms in neuromorphic hardware is a challenge, especially for accelerated systems.

Although the HMF features implementations of nearest-neighbor Spike-Timing Dependent Plasticity (STDP) and Short Term Plasticity (STP) [Friedmann, 2013a, Billaudelle, 2014b], more complex update algorithms are hard to implement.
Numenta's networks rely on structural plasticity rules which go beyond these mechanisms.

640 :yamaguti:2019/01/14(月) 01:07:19.34 ID:tjARBhyFs ?2BP(3)
The spatial pooler's stimulus changes significantly for learned input patterns.
Verification of its functionality under these conditions is important.
In order to follow the HTM specification as closely as possible, a supervised update rule was implemented in an outer loop: for each time step, a matrix containing the connections' permanence values is updated according to the activity patterns of the previous time step.

This allows us to implement the concepts of structural plasticity presented in the original whitepaper.
For the target platform, the learning algorithms could be implemented on the Plasticity Processing Unit (PPU) which is planned for the next version of the HICANN chip [Friedmann, 2013b].
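
A hedged sketch of such an outer-loop update (the increment/decrement values and the connection threshold are invented; the whitepaper's boosting and duty-cycle mechanisms are omitted):

import numpy as np

P_INC, P_DEC, P_CONNECT = 0.05, 0.03, 0.5

def update_permanences(perm, input_bits, active_columns):
    # perm: (n_columns, n_inputs) permanence matrix in [0, 1].
    for col in active_columns:
        perm[col, input_bits == 1] += P_INC   # strengthen contributing synapses
        perm[col, input_bits == 0] -= P_DEC   # weaken inactive synapses
    np.clip(perm, 0.0, 1.0, out=perm)
    return perm >= P_CONNECT                  # new binary connectivity matrix

Note that the second line of the loop is exactly what plain STDP cannot express: synapses that were inactive while the column fired must still be weakened.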

Simulation results of the implementation described above are shown in Figure 9.

Experiments to replace the HTM structural plasticity rules by a classic nearest-neighbor STDP model did not yield the desired results.
The HTM learning rules require negative modifications to inactive synapses in segments that contribute to cell activity.
In contrast, STDP does not affect inactive synapses.



641 :yamaguti:2019/01/14(月) 01:13:11.19 ID:tjARBhyFs ?2BP(3)
3.3 Map and Route
Applying abstract network models to the hardware platform requires algorithms for placing the neuron populations and routing the synaptic connections.
In a best-case scenario, this processing step results in an isomorphic projection of the network graph to the hardware topology.
For networks with extreme connectivity requirements, however, synaptic losses must be expected.

Mapping the simulated networks does not represent a challenge for the routing algorithms.
The temporal memory can be projected to a single wafer without synaptic loss.
The same still applies with assumed lateral all-to-all connectivity resulting in approximately 2 million synapses.
The latter network corresponds to a network with a potential pool of 100 % which would allow the exploration of learning algorithms even without creating and pruning hardware synapses.

On the hardware platform, a tradeoff between the number of afferent connections per cell and the number of neurons must be taken into consideration: while it is possible to connect the dendritic membrane circuits such that a single neuron can listen on roughly 14 × 10^3 synapses, such a network could only consist of approximately 3 × 10^3 neurons per wafer.
With just 226 synapses per neuron, just under 200 × 10^3 neurons can be allocated per wafer.

Scaling up the proof-of-concept models to a size useful for production purposes, however, challenges the hardware topology as well as the projection algorithms.



642 :yamaguti:2019/01/14(月) 01:14:12.87 ID:tjARBhyFs ?2BP(3)
A minimally useful HTM network spans 1,024 columns with 8 cells each.
In such a scenario the neurons would receive lateral input on 32 dendritic segments.
Allowing approximately 1 × 10^3 afferent connections per dendritic segment, this network could be realized on approximately 1 × 10^6 dendritic membrane circuits, or six wafers (a back-of-the-envelope check follows below).
The existing system set up for the BSS would suffice for this scenario.
Even larger networks could be brought to the HBP's platform.
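
The estimate can be checked with a short calculation, assuming a ~1 × 10^3-synapse segment is built by ganging 226-synapse membrane circuits and ~200 × 10^3 circuits per wafer as in section 1.2:

import math

columns, cells_per_column, segments_per_cell = 1024, 8, 32
synapses_per_segment, synapses_per_circuit = 1000, 226
circuits_per_wafer = 200_000

segments = columns * cells_per_column * segments_per_cell        # 262,144
circuits = segments * math.ceil(synapses_per_segment / synapses_per_circuit)
print(circuits)                        # 1,310,720, i.e. on the order of 1e6
print(circuits / circuits_per_wafer)   # about 6.6, i.e. roughly six wafers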


4 CONCLUSION AND OUTLOOK

Implementing machine intelligence algorithms as spiking neural networks and porting them to a neuromorphic hardware platform presents high demands in terms of precision and scalability.

We have shown in this paper that HTMs can be successfully modeled in dynamic simulations.
The basic functionality of spatial pooler and temporal memory networks could be reproduced based on AdEx neurons.

In theory, the proof-of-concept networks can be easily transferred to the HMF, since the high-level software interfaces are designed to be interchangeable.
Of course, emulating the models on the actual hardware platform will bring up a new set of challenges.

Adapting the HTM' s learning rules to the native plasticity features available on the HMF has turned out to be nontrivial.

The learning rules could not be replicated with the current implementation of classic STDP.



643 :yamaguti:2019/01/14(月) 01:15:00.80 ID:tjARBhyFs ?2BP(3)
As a freely programmable microprocessor directly embedded into the neuromorphic core, the PPU provides the ability to extend the system's plasticity mechanisms in order to implement the HTM rules.
Further investigation is required to map out a complete implementation of the HTM update rules on the PPU.

Analog neuromorphic hardware is susceptible to transistor mismatch due to, e.g., dopant fluctuations in the production process [Petrovici et al., 2014].
A careful calibration of the individual neurons is required to compensate for these variations.
Due to the complexity of the problem and the high number of interdependent variables, a perfect calibration is hard to accomplish.
Therefore, network models are required to be tolerant regarding certain spatial and trial-to-trial variations on the computing substrate.
Carrying out additional Monte Carlo simulations with slightly randomized parameters is important to investigate the robustness of the presented networks.
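One way such a Monte Carlo sweep could be set up is sketched below; the AdEx parameter names and the 5% relative jitter are illustrative assumptions, not the platform's actual calibration data.

import numpy as np

rng = np.random.default_rng(42)
nominal = {"v_thresh": -50.0, "v_rest": -65.0, "tau_m": 10.0}  # example AdEx values

def jittered(sigma=0.05):
    # draw one slightly randomized parameter set (~5% spread per parameter)
    return {k: v * (1.0 + sigma * rng.standard_normal()) for k, v in nominal.items()}

trials = [jittered() for _ in range(100)]  # feed each set into one simulation run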

Finally, a multicompartmental neuron model is planned for a later version of the neuromorphic platform.
Making use of this extended feature set will significantly increase the level of biophysical detail.
This will account for the detailed dendritic model used in HTMs and help to stay closer to the whitepaper as well as the reference implementation.

Besides paving the road towards a highly accelerated execution of HTM models, the HMF offers a high level of detail in its neuron implementation.
With the multicompartmental extension and a flexible plasticity framework, we anticipate the platform will prove valuable as a tool for further low-level research on HTM theories.


ACKNOWLEDGEMENTS

Special thanks to Jeff Hawkins, Prof. Dr. Karlheinz Meier, Paxon Frady, and the Numenta Team.

644 :yamaguti:2019/01/14(月) 01:15:44.36 ID:tjARBhyFs ?2BP(3)
REFERENCES

Sebastian Billaudelle.
PyHMF – eine PyNN-kompatible Schnittstelle für das HMF-System [PyHMF – a PyNN-compatible interface for the HMF system], 2014a.

Sebastian Billaudelle.
Characterisation and calibration of short term plasticity on a neuromorphic hardware chip.
Bachelor's thesis, Universität Heidelberg, 2014b.

Romain Brette and Wulfram Gerstner.
Adaptive exponential integrate-and-fire model as an effective description of neuronal activity.
Journal of Neurophysiology, 94(5):3637–3642, 2005.

Andrew P Davison, Daniel Brüderle, Jochen Eppler, Jens Kremkow, Eilif Muller, Dejan Pecevski, Laurent Perrinet, and Pierre Yger.
PyNN: a common interface for neuronal network simulators.
Frontiers in Neuroinformatics, 2, 2008.

Ruth Erlanson and Yaser S Abu-Mostafa.
Analog neural networks as decoders.
In Advances in neural information processing systems, pages 585–588, 1991.

Andrew C Felch and Richard H Granger.
The hypergeometric connectivity hypothesis: Divergent performance of brain circuits with different synaptic connectivity distributions.
Brain Research, 1202:3–13, 2008.

Simon Friedmann.
A new approach to learning in neuromorphic hardware.
PhD thesis, Universität Heidelberg, 2013.

645 :yamaguti:2019/01/14(月) 01:16:25.94 ID:tjARBhyFs ?2BP(3)
SB Furber, F Galluppi, S Temple, and LA Plana.
The SpiNNaker project.
Proceedings of the IEEE, 102(5):652–665, 2014.

Marc-Oliver Gewaltig and Markus Diesmann.
NEST (NEural Simulation Tool).
Scholarpedia, 2(4):1430, 2007.

Jeff Hawkins, Subutai Ahmad, and Donna Dubinsky.
Cortical Learning Algorithm and Hierarchical Temporal Memory, 2011.
URL http://numenta.org/resources/HTM_CorticalLearningAlgorithms.pdf

HBP SP9 partners.
Neuromorphic Platform Specification.
Human Brain Project, March 2014.

A.L. Hodgkin and A.F. Huxley.
A quantitative description of membrane current and its application to conduction and excitation in nerve.
Journal of Physiology, 117:500–544, 1952.

Renaud Jolivet, Felix Schürmann, Thomas K Berger, Richard Naud, Wulfram Gerstner, and Arnd Roth.
The quantitative single-neuron modeling competition.
Biological Cybernetics, 99(4-5):417–426, 2008.

Wolfgang Maass.
Neural computation with winner-take-all as the only nonlinear operation.
In Advances in Neural Information Processing Systems, 2000.

646 :yamaguti:2019/01/14(月) 01:17:03.52 ID:tjARBhyFs ?2BP(3)
Mihai A. Petrovici, Bernhard Vogginger, Paul Müller, Oliver Breitwieser, Mikael Lundqvist, Lyle Muller, Matthias Ehrlich, Alain Destexhe, Anders Lansner, René Schüffny, Johannes Schemmel, and Karlheinz Meier.
Characterization and compensation of network-level anomalies in mixed-signal neuromorphic modeling platforms.
PLOS ONE, October 2014.
doi:10.1371/journal.pone.0108590.
URL http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0108590

Sebastian Millner.
Development of a Multi-Compartment Neuron Model Emulation.
PhD thesis, Universität Heidelberg, 2012.

Numenta, Inc.
Numenta Platform for Intelligent Computing (NuPIC).
URL http://numenta.org/

Pavlo V. Tymoshchuk.
A continuous-time model of analogue k-winners-take-all neural circuit.
In Chrisina Jayne, Shigang Yue, and Lazaros Iliadis, editors, Engineering Applications of Neural Networks, volume 311 of Communications in Computer and Information Science, pages 94–103.
Springer Berlin Heidelberg, 2012.
ISBN 978-3-642-32908-1.



647 :yamaguti:2019/01/14(月) 01:17:46.13 ID:tjARBhyFs ?2BP(3)
Fig. 4.

[Membrane traces omitted: V_distal, V_inh, and V_soma (-70 to -50 mV) for cells 1-3 over t = 0-400 ms.]

Neuron traces for a temporal memory column containing three HTM cells.
Each of these cells is represented by a somatic compartment, an inhibitory helper cell, and two dendritic segments.
The column is activated by proximal input in every time step and receives random distal stimulus predicting none, one, or more cells per step.
As indicated by the automatic classification algorithm, the column exhibits a correct response pattern to these predictions.

650 :yamaguti:2019/01/14(月) 01:19:26.40 ID:tjARBhyFs ?2BP(3)
Fig. 8.

[Raster plots omitted: cell index (0-7) versus column index (0-127) for sequence elements a1-c1, a2-c2, a3-c3, and for the random separator inputs.]
654 :yamaguti:2019/01/14(月) 01:21:58.81 ID:tjARBhyFs ?2BP(3)
A LIF-neuron-based temporal memory implementation correctly predicting different patterns.
Predicted cells are marked blue, active cells in purple.
The network spans 128 columns with each of their eight HTM cells collecting distal stimuli via two dendritic segments.

Connectivity for the distal inputs was configured externally.
The model was presented with three disjoint sequences of size three.
The individual patterns were separated by a random input Sparse Distributed Representation (SDR).



655 :YAMAGUTIseisei:2019/01/20(日) 23:48:47.88 ID:x0oJYv1FQ ?2BP(3)
>>632
While a neuromorphic hardware chip implementing such a model is planned and first steps in that direction have already been taken [Millner, 2012], the current system does not provide this feature.
>>644
– eine PyNN-kompatible Schnittstelle für das
>>646
Schüffny Johannes Schemmel Karlheinz Meier Mihai A. Petrovici, Bernhard Vogginger.



http://numenta.com/press/2019/01/14/numenta-publishes-breakthrough-theory-for-intelligence-and-cortical-computation/
http://numenta.com/neuroscience-research/research-publications/papers/a-framework-for-intelligence-and-cortical-function-based-on-grid-cells-in-the-neocortex/
http://doi.org/10.3389/fncir.2018.00121
http://www.frontiersin.org/articles/10.3389/fncir.2018.00121/full

Hypothesis and Theory ARTICLE
Front. Neural Circuits, 11 January 2019 | https://doi.org/10.3389/fncir.2018.00121
A Framework for Intelligence and Cortical Function Based on Grid Cells in the Neocortex
Jeff Hawkins*, Marcus Lewis, Mirko Klukas, Scott Purdy and Subutai Ahmad

* Numenta, Inc., Redwood City, CA, United States

656 :YAMAGUTIseisei:2019/07/28(日) 15:25:58.33 ID:WB6ziRYun
http://webcache.googleusercontent.com/search?q=cache:arxiv.org/pdf/1511.00083


Why Neurons Have Thousands of Synapses, A Theory of Sequence Memory in Neocortex

Jeff Hawkins*, Subutai Ahmad Numenta, Inc, Redwood City, California, United States of America
*Corresponding author Emails: jhawkinsATnumenta , sahmadATnumenta


Keywords: neocortex, prediction, neocortical theory, active dendrites, sequence memory.

A version of this manuscript has been submitted for publication as of October 30, 2015.
Note that figures and tables are at the end of this PDF.
Please contact the authors for updated citation information.



657 :YAMAGUTIseisei:2019/07/28(日) 15:39:06.64 ID:WB6ziRYun
Abstract
Neocortical neurons have thousands of excitatory synapses.
It is a mystery how neurons integrate the input from so many synapses and what kind of large-scale network behavior this enables.
It has been previously proposed that non-linear properties of dendrites enable neurons to recognize multiple patterns.
In this paper we extend this idea by showing that a neuron with several thousand synapses arranged along active dendrites can learn to accurately and robustly recognize hundreds of unique patterns of cellular activity,
even in the presence of large amounts of noise and pattern variation.
We then propose a neuron model where some of the patterns recognized by a neuron lead to action potentials and define the classic receptive field of the neuron,
whereas the majority of the patterns recognized by a neuron act as predictions by slightly depolarizing the neuron without immediately generating an action potential.


658 :YAMAGUTIseisei:2019/07/28(日) 15:40:31.28 ID:WB6ziRYun
We then present a network model based on neurons with these properties and show that the network learns a robust model of time-based sequences.
Given the similarity of excitatory neurons throughout the neocortex and the importance of sequence memory in inference and behavior, we propose that this form of sequence memory is a universal property of neocortical tissue.
We further propose that cellular layers in the neocortex implement variations of the same sequence memory algorithm to achieve different aspects of inference and behavior.
The neuron and network models we introduce are robust over a wide range of parameters as long as the network uses a sparse distributed code of cellular activations.
The sequence capacity of the network scales linearly with the number of synapses on each neuron.
Thus neurons need thousands of synapses to learn the many temporal patterns in sensory stimuli and motor sequences.

659 :YAMAGUTIseisei:2019/07/28(日) 15:44:17.63 ID:WB6ziRYun
1. Introduction
Excitatory neurons in the neocortex have thousands of excitatory synapses.
The proximal synapses, those closest to the cell body, have a relatively large effect on the likelihood of a cell generating an action potential.
However, a majority of the synapses are distal, or far from the cell body.
The activation of a single distal synapse has little effect at the soma, and for many years it was hard to imagine how the thousands of distal synapses could play an important role in determining a cell’s responses (Major et al., 2013).
We now know that dendrite branches are active processing elements.
The activation of several distal synapses within close spatial and temporal proximity can lead to a local dendritic NMDA spike and consequently a significant and sustained depolarization of the soma (Antic et al., 2010; Major et al., 2013).
This has led some researchers to suggest that dendritic branches act as independent pattern recognizers (Poirazi et al., 2003; Polsky et al., 2004).
Yet, despite the many advances in understanding the active properties of dendrites, it remains a mystery why neurons have so many synapses and what their precise role is in memory and cortical processing.

660 :YAMAGUTIseisei:2019/07/28(日) 15:45:38.56 ID:WB6ziRYun
Lacking a theory of why neurons have active dendrites, almost all artificial neural networks, such as those used in deep learning (LeCun et al., 2015) and spiking neural networks (Maass, 1997),
use artificial neurons without active dendrites and with unrealistically few synapses, strongly suggesting they are missing key functional aspects of real neural tissue.
If we want to understand how the neocortex works and build systems that work on the same principles as the neocortex, we need an understanding of how biological neurons use their thousands of synapses and active dendrites.
Of course, neurons cannot be understood in isolation.
We also need a complementary theory of how networks of neurons, each with thousands of synapses, work together towards a common purpose.


661 :YAMAGUTIseisei:2019/07/28(日) 15:50:46.45 ID:WB6ziRYun
In this paper we introduce such a theory.
First, we show how a typical pyramidal neuron with active dendrites and thousands of synapses can recognize hundreds of unique patterns of cellular activity.
We show that a neuron can recognize hundreds of patterns even in the presence of large amounts of noise and variability as long as overall neural activity is sparse.
Next we introduce a neuron model where the inputs to different parts of the dendritic tree serve different purposes.
In this model the patterns recognized by a neuron’s distal synapses are used for prediction.
Each neuron learns to recognize hundreds of patterns that often precede the cell becoming active.
The recognition of any one of these learned patterns acts as a prediction by depolarizing the cell without directly causing an action potential.
Finally, we show how a network of neurons with this property will learn and recall sequences of patterns.
The network model relies on depolarized neurons firing quickly and inhibiting other nearby neurons, thus biasing the network’s activation towards its predictions.
Through simulation we illustrate that the sequence memory network exhibits numerous desirable properties such as on-line learning, multiple simultaneous predictions, and robustness.

Given the similarity of neurons throughout the neocortex and the importance of sequence memory for inference and behavior,
we propose that sequence memory is a property of neural tissue throughout the neocortex and thus represents a new and important unifying principle for understanding how the neocortex works.


662 :YAMAGUTIseisei:2019/07/28(日) 15:55:33.31 ID:WB6ziRYun
2. Results

2.1. Neurons Recognize Multiple Patterns

It is common to think of a neuron as recognizing a single pattern of activity on its synapses.
This notion, sometimes called a “point neuron”, forms the basis of almost all artificial neural networks (Fig. 1A).

[Figure 1 about here see end of manuscript]

Active dendrites suggest a different view of the neuron, where neurons recognize many unique patterns (Larkum and Nevian, 2008; Poirazi et al., 2003; Polsky et al., 2004).
Experimental results show that the coincident activation of eight to twenty synapses in close spatial proximity on a dendrite will combine in a non-linear fashion and cause an NMDA dendritic spike
(Larkum et al., 1999; Major et al., 2013; Schiller and Schiller, 2001; Schiller et al., 2000).
Thus, a small set of neighboring synapses acts as a pattern detector.
It follows that the thousands of synapses on a cell’s dendrites act as a set of independent pattern detectors.



663 :YAMAGUTIseisei:2019/07/28(日) 16:00:59.36 ID:WB6ziRYun
The detection of any of these patterns causes an NMDA spike and subsequent depolarization at the soma.

It might seem that eight to twenty synapses could not reliably recognize a pattern of activity in a large population of cells.
However, robust recognition is possible if the patterns to be recognized are sparse; i.e. few neurons are active relative to the population (Olshausen and Field, 2004).
For example, consider a population of 200K cells where 1% (2,000) of the cells are active at any point in time.
We want a neuron to detect when a particular pattern occurs in the 200K cells.
If a section of the neuron's dendrite forms new synapses to just 10 of the 2,000 active cells, and the threshold for generating an NMDA spike is 10, then the dendrite will detect the target pattern when all 10 synapses receive activation at the same time.
Note that the dendrite could falsely detect many other patterns that share the same 10 active cells.
However, if the patterns are sparse, the chance that the 10 synapses would become active for a different random pattern is small.
In this example it is only 9.8 x 10^-21.

The probability of a false match can be calculated precisely as follows.
Let n represent the size of the cell population and A the number of active cells in that population at a given point in time; for sparse patterns A ≪ n.
Let s be the number of synapses on a dendritic segment and θ be the NMDA spike threshold.
We say the segment recognizes a pattern if at least θ synapses become active, i.e. at least θ of the s synapses match the currently active cells.


664 :YAMAGUTIseisei:2019/07/28(日) 16:02:04.52 ID:WB6ziRYun
Assuming a random distribution of patterns, the exact probability of a false match is given by:


(1)

P(\text{false match}) = \frac{\sum_{b=\theta}^{s} \binom{s}{b} \binom{n-s}{A-b}}{\binom{n}{A}}


The denominator is simply the total number of possible patterns containing A active cells in a population of n total cells.
The numerator counts the number of patterns that would connect to θ or more of the s synapses on one dendritic segment.
A more detailed description of this equation can be found in (Ahmad and Hawkins, 2015).

The equation shows that a non-linear dendritic segment can robustly classify a pattern by sub-sampling (forming synapses to only a small number of the cells in the pattern to be classified).
Table A in S1 Text lists representative error probabilities calculated from Eq.(1).

By forming more synapses than necessary to generate an NMDA spike, recognition becomes robust to noise and variation.
For example, if a dendrite has an NMDA spike threshold of 10 but forms 20 synapses to the pattern it wants to recognize, twice as many as needed, the dendrite can recognize the target pattern even if 50% of the cells are changed or inactive.
The extra synapses also increase the likelihood of a false positive error.
Although the chance of error has increased, Eq. (1) shows that it is still tiny when the patterns are sparse.
In the above example, doubling the number of synapses, and hence introducing a 50% noise tolerance, increases the chance of error to only 1.6 x 10^-18.
Table B in S1 Text lists representative error rates when the number of synapses exceeds the threshold.
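Eq. (1) is easy to evaluate with exact integer combinatorics; the sketch below (function name ours) reproduces both of the error probabilities quoted above.

from math import comb

def p_false_match(n, a, s, theta):
    # Eq. (1): probability that a random pattern of `a` active cells (out
    # of n) activates at least `theta` of the `s` synapses on a segment
    numerator = sum(comb(s, b) * comb(n - s, a - b) for b in range(theta, s + 1))
    return numerator / comb(n, a)

print(p_false_match(200_000, 2_000, 10, 10))  # ~9.8e-21
print(p_false_match(200_000, 2_000, 20, 10))  # ~1.6e-18, with 50% noise tolerance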

665 :YAMAGUTIseisei:2019/07/28(日) 16:08:34.81 ID:WB6ziRYun
The synapses recognizing a given pattern have to be co-located on a dendritic segment.
If they lie within 40μm of each other then as few as eight synapses are sufficient to create an NMDA spike (Major et al., 2008).
If the synapses are spread out along the dendritic segment, then up to twenty synapses are needed (Major et al., 2013).
A dendritic segment can contain several hundred synapses; therefore each segment can detect multiple patterns.
If synapses that recognize different patterns are mixed together on the dendritic segment, it introduces an additional possibility of error by co-activating synapses from different patterns.
The probability of this type of error depends on how many sets of synapses share the dendritic segment and the sparsity of the patterns to be recognized.
For a wide range of values the chance for this type of error is still low (Table C in S1 Text).
Thus the placement of synapses to recognize a particular pattern is somewhat precise (they must be on the same dendritic segment and ideally within 40μm of each other), but also somewhat imprecise (mixing with other synapses is unlikely to cause errors).

If we assume an average of 20 synapses are allocated to recognize each pattern, and that a neuron has 6,000 synapses, then a cell would have the ability to recognize approximately 300 different patterns.
This is a rough approximation, but makes evident that a neuron with active dendrites can learn to reliably recognize hundreds of patterns within a large population of cells.
The recognition of any one of these patterns will depolarize the cell.
Since all excitatory neurons in the neocortex have thousands of synapses, and, as far as we know, they all have active dendrites, then each and every excitatory neocortical neuron recognizes hundreds of patterns of neural activity.

666 :YAMAGUTIseisei:2019/07/28(日) 16:24:14.55 ID:WB6ziRYun
In the next section we propose that most of the patterns recognized by a neuron do not directly lead to an action potential, but instead play a role in how networks of neurons make predictions and learn sequences.

2.1.1. Three Sources of Synaptic Input to Cortical Neurons

Neurons receive excitatory input from different sources that are segregated on different parts of the dendritic tree.
Fig. 1B shows a typical pyramidal cell, the most common excitatory neuron in the neocortex.
We show the input to the cell divided into three zones.
The proximal zone receives feedforward input.
The basal zone receives contextual input, mostly from nearby cells in the same cortical region (Petreanu et al., 2009; Rah et al., 2013; Yoshimura et al., 2000).
The apical zone receives feedback input (Spruston, 2008).
(The second most common excitatory neuron in the neocortex is the spiny stellate cell; we suggest they be considered similar to pyramidal cells minus the apical dendrites.)
We propose the three zones of synaptic integration on a neuron (proximal, basal, and apical) serve the following purposes.

Proximal Synapses Define the Classic Receptive Field of a Cell
The synapses on the proximal dendrites (typically several hundred) have a relatively large effect at the soma and therefore are best situated to define the basic receptive field response of the neuron (Spruston, 2008).



667 :YAMAGUTIseisei:2019/07/28(日) 16:29:43.09 ID:WB6ziRYun
If the coincident activation of a subset of the proximal synapses is sufficient to generate a somatic action potential, and if the inputs to the proximal synapses are sparsely active,
then the proximal synapses will recognize multiple unique feedforward patterns in the same manner as discussed earlier.
Therefore, the feedforward receptive field of a cell can be thought of as a union of feedforward patterns.

Basal Synapses Learn Transitions in Sequences

We propose that basal dendrites of a neuron recognize patterns of cell activity that precede the neuron firing; in this way the basal dendrites learn and store transitions between activity patterns.
When a pattern is recognized on a basal dendrite it generates an NMDA spike.
The depolarization due to an NMDA spike attenuates in amplitude by the time it reaches the soma, therefore when a basal dendrite recognizes a pattern it will depolarize the soma but not enough to generate a somatic action potential
(Antic et al., 2010; Major et al., 2013).
We propose this sub-threshold depolarization is an important state of the cell.
It represents a prediction that the cell will become active shortly and plays an important role in network behavior.
A slightly depolarized cell fires earlier than it would otherwise if it subsequently receives sufficient feedforward input.
By firing earlier it inhibits neighboring cells, creating highly sparse patterns of activity for correctly predicted inputs.
We will explain this mechanism more fully in a later section.


668 :YAMAGUTIseisei:2019/07/28(日) 16:38:16.16 ID:WB6ziRYun
Apical Synapses Invoke a Top-down Expectation

The apical dendrites of a neuron also generate NMDA spikes when they recognize a pattern (Cichon and Gan, 2015).
An apical NMDA spike does not directly affect the soma.
Instead it can lead to a Ca2+ spike in the apical dendrite (Golding et al., 1999; Larkum et al., 2009).
A single apical Ca2+ spike will depolarize the soma, but typically not enough to generate a somatic action potential (Antic et al., 2010).
The interaction between apical Ca2+ spikes, basal NMDA spikes, and somatic action potentials is an area of ongoing research (Larkum, 2013),
but we can say that under many conditions a recognized pattern on an apical dendrite will depolarize the cell and therefore have a similar effect as a recognized pattern on a basal dendrite.
We propose that the depolarization caused by the apical dendrites is used to establish a top-down expectation, which can be thought of as another form of prediction.

2.1.2. The HTM Model Neuron

Fig. 1C shows an abstract model of a pyramidal neuron we use in our software simulations.
We model a cell’s dendrites as a set of threshold coincidence detectors; each with its own synapses.
If the number of active synapses on a dendrite/coincidence detector exceeds a threshold the cell detects a pattern.
The coincidence detectors are in three groups corresponding to the proximal, basal, and apical dendrites of a pyramidal cell.
We refer to this model neuron as an “HTM neuron” to distinguish it from biological neurons and point neurons.
HTM is an acronym for Hierarchical Temporal Memory, a term used to describe our models of neocortex (Hawkins et al., 2011).
HTM neurons used in the simulations for this paper have 128 dendrite/coincidence detectors with up to 40 synapses per dendrite.
For clarity, Fig. 1C shows only a few dendrites and synapses.
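A minimal sketch of such a neuron is shown below. The class and parameter names are ours, the coincidence threshold is illustrative, and randomly sampled segments stand in for learned ones.

import numpy as np

class HTMNeuron:
    def __init__(self, n_cells, n_segments=128, seg_size=40, theta=15, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        # each dendrite/coincidence detector samples seg_size presynaptic cells
        self.segments = [rng.choice(n_cells, size=seg_size, replace=False)
                         for _ in range(n_segments)]
        self.theta = theta

    def detects(self, active):
        # active: boolean vector over the population; a pattern is detected
        # if any segment crosses the coincidence threshold
        return any(active[seg].sum() >= self.theta for seg in self.segments)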


669 :YAMAGUTIseisei:2019/07/28(日) 16:44:56.98 ID:WB6ziRYun
2.2. Networks of Neurons Learn Sequences

Because all tissue in the neocortex consists of neurons with active dendrites and thousands of synapses, there are likely common network principles underlying everything the neocortex does.
This leads to the question, what network property is so fundamental that it is a necessary component of sensory inference, prediction, language, and motor planning?

We propose that the most fundamental operation of all neocortical tissue is learning and recalling sequences of patterns (Hawkins and Blakeslee, 2004),
what Karl Lashley famously called “the most important and also the most neglected problem of cerebral physiology” (Lashley, 1951).
More specifically, we propose that each cellular layer in the neocortex implements a variation of a common sequence memory algorithm.
We propose cellular layers use sequence memory for different purposes, which is why cellular layers vary in details such as size and connectivity.
In this paper we illustrate what we believe is the basic sequence memory algorithm without elaborating on its variations.

We started our exploration of sequence memory by listing several properties required of our network in order to model the neocortex.
1) On-line learning.
Learning must be continuous.
If the statistics of the world change, the network should gradually and continually adapt with each new input.

2) High-order predictions.
Making correct predictions with complex sequences requires the ability to incorporate contextual information from the past.
The network needs to dynamically determine how much temporal context is needed to make the best predictions.
The term “high-order” refers to “high-order Markov chains” which have this property.


670 :YAMAGUTIseisei:2019/07/28(日) 16:45:53.96 ID:WB6ziRYun
3) Multiple simultaneous predictions.
Natural data streams often have overlapping and branching sequences.
The sequence memory therefore needs to make multiple predictions at the same time.

4) Local learning rules.
The sequence memory must only use learning rules that are local to each neuron.
The rules must be local in both space and time, without the need for a global objective function.

5) Robustness.
The memory should exhibit robustness to high levels of noise, loss of neurons, and natural variation in the input.
Degradation in performance under these conditions should be gradual.

All these properties must occur simultaneously in the context of continuously streaming data.






2.2.1. Mini-columns and Neurons: Two Representations

High-order sequence memory requires two simultaneous representations.
One represents the feedforward input to the network and the other represents the feedforward input in a particular temporal context.
To illustrate this requirement, consider two abstract sequences “ABCD” and “XBCY”, where each letter represents a sparse pattern of activation in a population of neurons.
Once these sequences are learned the network should predict “D” when presented with sequence “ABC” and it should predict “Y” when presented with sequence “XBC”.
Therefore, the internal representation during the subsequence “BC” must be different in the two cases; otherwise the correct prediction can’t be made after “C” is presented.

671 :YAMAGUTIseisei:2019/07/28(日) 16:47:13.73 ID:WB6ziRYun
Fig. 2 illustrates how we propose these two representations are manifest in a cellular layer of cortical neurons.
The panels in Fig. 2 represent a slice through a single cellular layer in the neocortex (Fig. 2A).
The panels are greatly simplified for clarity.
Fig. 2B shows how the network represents two input sequences before the sequences are learned.
Fig. 2C shows how the network represents the same input after the sequences are learned.
Each feedforward input to the network is converted into a sparse set of active mini-columns.
(Mini-columns in the neocortex span multiple cellular layers.
Here we are only referring to the cells in a mini-column in one cellular layer.) All the neurons in a mini-column share the same feedforward receptive fields.
If an unanticipated input arrives, then all the cells in the selected mini-columns will recognize the input pattern and become active.
However, in the context of a previously learned sequence, one or more of the cells in the mini-columns will be depolarized.
The depolarized cells will be the first to generate an action potential, inhibiting the other cells nearby.
Thus a predicted input will lead to a very sparse pattern of cell activation that is unique to a particular element, at a particular location, in a particular sequence.

[Figure 2 about here see end of manuscript]

672 :YAMAGUTIseisei:2019/07/28(日) 16:53:49.06 ID:WB6ziRYun
2.2.2. Basal Synapses Are the Basis of Sequence Memory

In this theory, cells use their basal synapses to learn the transitions between input patterns.
With each new feedforward input some cells become active via their proximal synapses.
Other cells, using their basal synapses, learn to recognize this active pattern and upon seeing the pattern again, become depolarized, thereby predicting their own feedforward activation in the next input.
Feedforward input activates cells, while basal input generates predictions.
As long as the next input matches the current prediction, the sequence continues, Fig. 3.
Fig. 3A shows both active cells and predicted cells while the network follows a previously learned sequence.

[Figure 3 about here see end of manuscript]

Often the network will make multiple simultaneous predictions.
For example, suppose that after learning the sequences “ABCD” and “XBCY” we expose the system to just the ambiguous sub-sequence “BC”.
In this case we want the system to simultaneously predict both “D” and “Y”.
Fig. 3B illustrates how the network makes multiple predictions when the input is ambiguous.
The number of simultaneous predictions that can be made with low chance of error can again be calculated via Eq.(1).
Because the predictions tend to be highly sparse, it is possible for a network to predict dozens of patterns simultaneously without confusion.
If an input matches any of the predictions it will result in the correct highly-sparse representation.
If an input does not match any of the predictions all the cells in a column will become active, indicating an unanticipated input.

673 :YAMAGUTIseisei:2019/07/28(日) 16:55:24.70 ID:WB6ziRYun
Although every cell in a mini-column shares the same feedforward response, their basal synapses recognize different patterns.
Therefore cells within a mini-column will respond uniquely in different learned temporal contexts, and overall levels of activity will be sparser when inputs are anticipated.
Both of these attributes have been observed (Martin and Schröder, 2013; Vinje and Gallant, 2002; Yen et al., 2007).

For one of the cells in the last panel of Fig. 3A, we show three connections the cell used to make a prediction.
In real neurons, and in our simulations, a cell would form 15 to 40 connections to a subset of a larger population of active cells.

2.2.3. Apical Synapses Create a Top-Down Expectation

Feedback axons between neocortical regions often form synapses (in layer 1) with apical dendrites of pyramidal neurons whose cell bodies are in layers 2, 3, and 5.
It has long been speculated that these feedback connections implement some form of expectation or bias (Lamme et al., 1998).
Our neuron model suggests a mechanism for top-down expectation in the neocortex.
Fig. 4 shows how a stable feedback pattern to apical dendrites can predict multiple elements in a sequence all at the same time.
When a new feedforward input arrives it will be interpreted as part of the predicted sequence.
The feedback biases the input towards a particular interpretation.
Again, because the patterns are sparse, many patterns can be simultaneously predicted.

[Figure 4 about here see end of manuscript]

674 :YAMAGUTIseisei:2019/07/28(日) 17:02:38.66 ID:WB6ziRYun
Thus there are two types of prediction occurring at the same time.
Lateral connections to basal dendrites predict the next input, and top-down connections to apical dendrites predict multiple sequence elements simultaneously.
The physiological interaction between apical and basal dendrites is an area of active research (Larkum, 2013) and will likely lead to a more nuanced interpretation of their roles in inference and prediction.
However, we propose that the mechanisms shown in Figs.2, 3 and 4 are likely to continue to play a role in that final interpretation.

2.2.4. Synaptic Learning Rule

Our neuron model requires two changes to the learning rules by which most neural models learn.
First, learning occurs by growing and removing synapses from a pool of “potential” synapses (Chklovskii et al., 2004).
Second, Hebbian learning and synaptic change occur at the level of the dendritic segment, not the entire neuron (Stuart and Häusser, 2001).

Potential Synapses
For a neuron to recognize a pattern of activity it requires a set of co-located synapses (typically fifteen to twenty) that connect to a subset of the cells that are active in the pattern to be recognized.






Learning to recognize a new pattern is accomplished by the formation of a set of new synapses collocated on a dendritic segment.

675 :YAMAGUTIseisei:2019/07/28(日) 17:03:49.53 ID:WB6ziRYun


Figure 5 shows how we model the formation of new synapses in a simulated HTM neuron.
For each dendritic segment we maintain a set of “potential” synapses between the dendritic segment and other cells in the network that could potentially form a synapse with the segment (Chklovskii et al., 2004).
The number of potential synapses is larger than the number of actual synapses.
We assign each potential synapse a scalar value called “permanence” which represents stages of growth of the synapse.
A permanence value close to zero represents an axon and dendrite with the potential to form a synapse but that have not commenced growing one.
A 1.0 permanence value represents an axon and dendrite with a large fully formed synapse.

[Figure 5 about here see end of manuscript]

The permanence value is incremented and decremented using a Hebbian-like rule.
If the permanence value exceeds a threshold, such as 0.3, then the weight of the synapse is 1; if the permanence value is at or below the threshold, then the weight of the synapse is 0.
The threshold represents the establishment of a synapse, albeit one that could easily disappear.
A synapse with a permanence value of 1.0 has the same effect as a synapse with a permanence value at threshold but is not as easily forgotten.
Using a scalar permanence value enables on-line learning in the presence of noise.
A previously unseen input pattern could be noise or it could be the start of a new trend that will repeat in the future.
By growing new synapses, the network can start to learn a new pattern when it is first encountered, but only act differently after several presentations of the new pattern.
Increasing permanence beyond the threshold means that patterns experienced more than others will take longer to forget.
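The permanence mechanism lends itself to a compact sketch; the threshold and increment values below are illustrative, not the paper's.

import numpy as np

CONNECT_THRESHOLD = 0.3   # permanence above this means a weight-1 synapse

def weights(perm):
    return (perm > CONNECT_THRESHOLD).astype(int)

def hebbian_update(perm, presyn_active, inc=0.05, dec=0.01):
    # reward potential synapses whose presynaptic cell was active,
    # gently punish the rest; permanences stay within [0, 1]
    delta = np.where(presyn_active, inc, -dec)
    return np.clip(perm + delta, 0.0, 1.0)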

676 :YAMAGUTIseisei:2019/07/28(日) 17:06:12.64 ID:WB6ziRYun
HTM neurons and HTM networks rely on distributed patterns of cell activity, thus the activation strength of any one neuron or synapse is not very important.
Therefore, in HTM simulations we model neuron activations and synapse weights with binary states.
Additionally, it is well known that biological synapses are stochastic (Faisal et al., 2008), so a neocortical theory cannot require precision of synaptic efficacy.
Although scalar states and weights might improve performance, they are not required from a theoretical point of view and all of our simulations have performed well without them.
The formal learning rules used in our HTM network simulations are presented in the Materials and Methods section.

3. Simulation Results

Fig. 6 illustrates the performance of a network of HTM neurons implementing a high-order sequence memory.
The network used in Fig. 6 consists of 2048 mini-columns with 32 neurons per mini-column.
Each neuron has 128 basal dendritic segments, and each dendritic segment has up to 40 actual synapses.
Because this simulation is designed to only illustrate properties of sequence memory it does not include apical synapses.
The network exhibits all five of the desired properties for sequence memory listed earlier.

[Figure 6 about here see end of manuscript]

677 :YAMAGUTIseisei:2019/07/28(日) 17:11:21.75 ID:WB6ziRYun
Although we have applied HTM networks to many types of real-world data, in Fig. 6 we use an artificial data set to more clearly illustrate the network’s properties.
The input is a stream of elements, where every element is converted to a 2% sparse activation of mini-columns (40 active columns out of 2048 total).
The network learns a predictive model of the data based on observed transitions in the input stream.
In Fig. 6 the data stream fed to the network contains a mixture of random elements and repeated sequences.
The embedded sequences are six elements long and require high-order temporal context for full disambiguation and best prediction accuracy, e.g.“XABCDE” and “YABCFG”.
For this simulation we designed the input data stream such that the maximum possible average prediction accuracy is 50% and this is only achievable by using high-order representations.

Fig. 6A illustrates on-line learning and high-order predictions.
The prediction accuracy of the HTM network over time is shown in red.
The prediction accuracy starts at zero and increases as the network discovers the repeated temporal patterns mixed within the random transitions.
For comparison, the accuracy of a first-order network (created by using only one cell per column) is shown in blue.
After sufficient learning, the high-order HTM network achieves the maximum possible prediction accuracy of 50% whereas the first-order network only achieves about 33% accuracy.
After the networks reached their maximum performance the embedded sequences were modified.
The accuracy drops at that point, but since the network is continually learning it recovers by learning the new high-order patterns.

678 :YAMAGUTIseisei:2019/07/28(日) 17:12:10.25 ID:WB6ziRYun
Fig. 6B illustrates the robustness of the network.
After the network reached stable performance we inactivated a random selection of neurons.
At up to about 40% cell death there was minimal impact on performance.
This robustness is due to the noise tolerance described earlier that occurs when a dendritic segment forms more synapses than necessary to generate an NMDA spike.
At higher levels of cell death the network performance initially declines but then recovers as the network relearns the patterns using the remaining neurons.

4. Discussion

We presented a model cortical neuron that is substantially different than model neurons used in most artificial neural networks.
The key feature of the model neuron is its use of active dendrites and thousands of synapses, allowing the neuron to recognize hundreds of unique patterns in large populations of cells.
We showed that a neuron can reliably recognize many patterns, even in the presence of large amounts of noise and variation.
In this model, proximal synapses define the feedforward receptive field of a cell.
The basal and apical synapses depolarize the cell, representing predictions.

We showed that a network of these neurons will learn a predictive model of a stream of data.
Basal synapses detect contextual patterns that predict the next feedforward input.
Apical synapses detect feedback patterns that predict entire sequences.
The operation of the neuron and the network rely on neural activity being sparse.
The sequence memory model learns continuously, uses variable amounts of context to make predictions, makes multiple simultaneous predictions, relies on local learning rules, and is robust to failure of network elements, noise, and variation.



679 :YAMAGUTIseisei:2019/07/28(日) 17:14:44.14 ID:WB6ziRYun
Although we refer to the network model as a “sequence memory”, it is actually a memory of transitions.
There is no representation or concept of the length of sequences or of the number of stored sequences.
The network only learns transitions between inputs.
Therefore, the capacity of a network is measured by how many transitions a given network can store.
This can be calculated as the product of the expected duty cycle of an individual neuron (cells per column/column sparsity) times the number of patterns each neuron can recognize on its basal dendrites.
For example, a network where 2% of the columns are active, each column has 32 cells, and each cell recognizes 200 patterns on its basal dendrites, can store approximately 320,000 transitions ((32/0.02)*200).
The capacity scales linearly with the number of cells per column and the number of patterns recognized by the basal synapses of each neuron.

Another important capacity metric is how many times a particular input can appear in different temporal contexts without confusion.
This is analogous to how many times a particular musical interval can appear in melodies without confusion, or how many times a particular word can be memorized in different sentences.
If mini-columns have 32 cells it doesn’t mean a particular pattern can have only 32 different representations.
For example, if we assume 40 active columns per input, 32 cells per column, and one active cell per column, then there are 32^40 possible representations of each input pattern, a practically unlimited number.
Therefore, the practical limit is not representational but memory-based.
The capacity is determined by how many transitions can be learned with a particular sparse set of columns.
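Both capacity estimates restate directly in Python (numbers from the text; the second line prints the digit count of 32^40 rather than the full integer):

print(32 / 0.02 * 200)     # duty cycle x patterns per cell = 320,000 transitions
print(len(str(32 ** 40)))  # 32**40 is a 61-digit number of representations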

680 :YAMAGUTIseisei:2019/07/28(日) 17:15:42.13 ID:WB6ziRYun
So far we have only discussed cellular layers where all cells in the network can potentially connect to all other cells with equal likelihood.
This works well for small networks but not for large networks.
In the neocortex, it is well known that most regions have a topological organization.
For example cells in region V1 receive feedforward input from only a small part of the retina and receive lateral input only from a local area of V1.
HTM networks can be configured this way by arranging the columns in a 2D array and selecting the potential synapses for each dendrite using a 2D probability distribution centered on the neuron.
Topologically organized networks can be arbitrarily large.

There are several testable predictions that follow from this theory.
1) The theory provides an algorithmic explanation for the experimentally observed phenomenon that overall cell activity becomes sparser during a continuous predictable sensory stream (Martin and Schröder, 2013; Vinje and Gallant, 2002; Yen et al., 2007).
In addition, it predicts that unanticipated inputs will result in higher cell activity, which should be correlated vertically within mini-columns.
Anticipated inputs, on the other hand, will result in activity that is uncorrelated within mini-columns.
It is worth noting that mini-columns are not a strict requirement of this theory.
The model only requires the presence of small groups of cells that share feedforward responses and that are mutually inhibitory.
We refer to these groups as mini-columns, but the columnar aspect is not a requirement, and the groupings could be independent of actual mini-columns.

681 :YAMAGUTIseisei:2019/07/28(日) 17:16:48.16 ID:WB6ziRYun
2) A second core prediction of the theory is that the current pattern of cell activity contains information about past stimuli.
Early experimental results supporting this prediction have been reported in (Nikolić et al., 2009).
Further studies are required to validate the exact nature of dynamic cell activity and the role of temporal context in high-order sequences.

3) Synaptic plasticity should be localized to dendritic segments that have been depolarized via synaptic input followed a short time later by a back-propagating action potential.
This effect has been reported (Losonczy et al., 2008), though the phenomenon has yet to be widely established.
4) There should be few, ideally only one, excitatory synapses formed between a given axon and a given dendritic segment.
If an excitatory axon made many synapses in close proximity onto a single dendrite then the presynaptic cell would dominate in causing an NMDA spike.
Two, three, or even four synapses from a single axon onto a single dendritic segment could be tolerated, but if axons routinely made more synapses to a single dendritic segment it would lead to errors.
Pure Hebbian learning would seem to encourage forming multiple synapses.
To prevent this from happening we predict the existence of a mechanism that actively discourages the formation of multiple synapses after one has been established.
An axon can form synapses onto different dendritic segments of the same neuron without causing problems; therefore we predict this mechanism will be spatially localized within dendritic segments or to a local area of an axonal arbor.

5) When a cell depolarized by an NMDA spike subsequently generates an action potential via proximal input, it needs to inhibit all other nearby excitatory cells.
This requires a fast, probably single-spike, inhibition.
Fast-spiking basket inhibitory cells are the most likely source for this rapid inhibition (Hu et al., 2014).

682 :YAMAGUTIseisei:2019/07/28(日) 17:17:11.53 ID:WB6ziRYun
6) All cells in a mini-column need to learn common feedforward responses.
This requires a mechanism to encourage all the cells in a mini-column to become active simultaneously while learning feedforward patterns.
This requirement for mutual excitation seems at odds with the prior requirement for mutual inhibition when one or more cells are slightly depolarized.
We don’t have a specific proposal for how these two requirements are met, but we predict a mechanism where sometimes cells in a column are mutually excited and at other times they are mutually inhibited.

Pyramidal neurons are common in the hippocampus.
Hence, parts of our neuron and network models might apply to the hippocampus.
However, the hippocampus is known for fast learning, which is incompatible with growing new synapses, as synapse formation can take hours in an adult (Holtmaat and Svoboda, 2009; Knott et al., 2002; Niell et al., 2004; Trachtenberg et al., 2002).
Rapid learning could be achieved in our model if instead of growing new synapses, a cell had a multitude of inactive, or “silent” synapses (Kerchner and Nicoll, 2008).
Rapid learning would then occur by turning silent synapses into active synapses.
The downside of this approach is a cell would need many more synapses, which is metabolically expensive.






Pyramidal cells in hippocampal region CA2 have several times the number of synapses as pyramidal cells in the neocortex (Megías et al., 2001).
If most of these synapses were silent it would be evidence to suggest that region CA2 is also implementing a variant of our proposed sequence memory.

683 :YAMAGUTIseisei:2019/07/28(日) 17:19:19.66 ID:WB6ziRYun
It is instructive to compare our proposed biological sequence memory mechanism to other sequence memory techniques used in the field of machine learning.
The most common technique is Hidden Markov Models (HMMs) (Rabiner and Juang, 1986).
HMMs are widely applied, particularly in speech recognition.
The basic HMM is a first-order model and its accuracy would be similar to the first-order model shown in Fig. 6A.
Variations of HMMs can model restricted high order sequences by encoding high-order states by hand.
More recently, recurrent neural networks, specifically long short-term memory (LSTM) (Hochreiter and Schmidhuber, 1997), have become popular, often outperforming HMMs.
Unlike HTM networks, neither HMMs nor LSTMs attempt to model biology in any detail; as such they provide no insights into neuronal or neocortical functions.
The primary functional advantages of the HTM model over both these techniques are its ability to learn continuously, its superior robustness, and its ability to make multiple simultaneous predictions.
A more detailed comparison can be found in S1 Table.

684 :YAMAGUTIseisei:2019/07/28(日) 17:19:42.22 ID:WB6ziRYun
A number of papers have studied spiking neuron models (Ghosh-Dastidar and Adeli, 2009; Maass, 1997) in the context of sequences.
These models are more biophysically detailed than the neuron models used in the machine learning literature.
They show how spike-timing-dependent plasticity (STDP) can lead to a cell becoming responsive to a particular sequence of presynaptic spikes and to a specific time delay between each spike (Rao and Sejnowski, 2000; Ruf and Schmitt, 1997).
These models are at a lower level of detail than the HTM model proposed in this paper.
They explicitly model integration times of postsynaptic potentials and the corresponding time delays are typically sub-millisecond to a few milliseconds.
They also typically deal with a very small subset of the synapses and do not explicitly model non-linear active dendrites.
The focus of our work has been at a higher level.
The work presented in this paper is a model of the full set of synapses and active dendrites on a neuron, of a networked layer of such neurons and the emergence of a computationally sophisticated sequence memory.
An interesting direction for future research is to connect these two levels of modeling, i.e. to create biophysically detailed models that operate at the level of a complete layer of cells.
Some progress is reported in (Billaudelle and Ahmad, 2015), but there remains much to do on this front.

685 :YAMAGUTIseisei:2019/07/28(日) 17:20:31.99 ID:WB6ziRYun
A key consideration in learning algorithms is the issue of generalization, or the ability to robustly deal with novel patterns.
The sequence memory mechanism we have outlined learns by forming synapses to small samples of active neurons in streams of sparse patterns.
The properties of sparse representations naturally allow such a system to generalize.
Two randomly selected sparse patterns will have very little overlap.
Even a small overlap (such as 20%) is highly significant and implies that the representations share significant semantic meaning.
Dendritic thresholds are lower than the actual number of synapses on each segment, thus segments will recognize novel but semantically related patterns as similar.
The system will see similarity between different sequences and make novel predictions based on analogy.

Recently we showed that our sequence memory method can learn a predictive model of sensory-motor sequences (Cui et al., 2015).
We also see it is likely that cortical motor sequences are generated using a variation of the same network model.
Understanding how layers of cells can perform these different functions and how they work together is the focus of our current research.

5. Materials and Methods

Here we formally describe the activation and learning rules for an HTM sequence memory network.
There are three basic aspects to the rules: initialization, computing cell states, and updating synapses on dendritic segments.
These steps are described below, along with notation and some implementation details.

686 :YAMAGUTIseisei:2019/07/28(日) 17:21:37.95 ID:WB6ziRYun
Notation:
Let N represent the number of mini-columns in the layer, M the number of cells per column, and NM the total number of cells in the layer.
Each cell can be in an active state, in a predictive (depolarized) state, or in a non-active state.
Each cell maintains a set of segments each with a number of synapses.
(Here we use the term “synapse” to refer to “potential synapses” as described in the body of the paper; thus at any point in time some of the synapses will have a weight of 0 and some will have a weight of 1.)
At any time step t, the set of active cells is represented by the M×N binary matrix A_t , where a_t_ij is the activity of the i’th cell in the j’th column.
Similarly, the M×N binary matrix Π_t denotes cells in a predictive state at time t, where π_t_ij is the predictive state of the i’th cell in the j’th column.

Each cell is associated with a set of distal segments, D_ij , such that D_d_ij represents the d’th segment of the i’th cell in the j’th column.
Each distal segment contains a number of synapses, representing lateral connections from a subset of the other NM-1 cells.
Each synapse has an associated permanence value (see Supplemental Fig. 2).
Therefore, D_d_ij itself is also an M×N sparse matrix.
If there are s potential synapses associated with the segment, the matrix contains s non-zero elements representing permanence values.
A synapse is considered connected if its permanence value is above a connection threshold.
We use D̃_d_ij to denote a binary matrix containing only the connected synapses.
1)
Initialization:
the network is initialized such that each segment contains a set of potential synapses (i.e. with non-zero permanence value) to a randomly chosen subset of cells in the layer.
The permanence values of these potential synapses are chosen randomly: initially some are connected (above threshold) and some are unconnected.

687 :YAMAGUTIseisei:2019/07/28(日) 17:22:33.83 ID:WB6ziRYun
2)
Computing cell states:
All the cells in a mini-column share the same feed forward receptive fields.
We assume that an inhibitory process has already selected a set of k columns that best match the current feed forward input pattern.
We denote this set as W_t .
The active state for each cell is calculated as follows:


(2)

a^t_{ij} =
\begin{cases}
1 & \text{if } j \in W^t \text{ and } \pi^{t-1}_{ij} = 1 \\
1 & \text{if } j \in W^t \text{ and } \sum_i \pi^{t-1}_{ij} = 0 \\
0 & \text{otherwise}
\end{cases}

The first line will activate a cell in a winning column if it was previously in a predictive state.
If none of the cells in a winning column were in a predictive state, the second line will activate all cells in that column.
The predictive state for the current time step is then calculated as follows:

(3)

\pi^t_{ij} =
\begin{cases}
1 & \text{if } \exists_d\, \lVert \tilde{D}^d_{ij} \circ A^t \rVert_1 > \theta \\
0 & \text{otherwise}
\end{cases}

688 :YAMAGUTIseisei:2019/07/28(日) 17:23:19.83 ID:WB6ziRYun
Threshold θ represents the NMDA spiking threshold, ∘ denotes element-wise multiplication, and ‖·‖₁ counts the non-zero entries of the resulting binary matrix.
At a given point in time, if there are more than θ connected synapses with active presynaptic cells, then that segment will be active (generate an NMDA spike).
A cell will be depolarized if at least one segment is active.

3)
Updating segments and synapses:
the HTM synaptic plasticity rule is a Hebbian-like rule.
If a cell was correctly predicted (i.e. it was previously in a depolarized state and subsequently became active via feedforward input), we reinforce the dendritic segment that was active and caused the depolarization.
Specifically, we choose those segments D_d_ij such that:

(4)

\forall_{j \in W^t} :\; \sum_i \pi^{t-1}_{ij} > 0 \;\text{and}\; \lVert \tilde{D}^d_{ij} \circ A^{t-1} \rVert_1 > \theta

The first term selects winning columns that contained correct predictions.
The second term selects those segments specifically responsible for the prediction.

If a winning column was unpredicted, we need to select one cell that will represent the context in the future if the current sequence transition repeats.
To do this we select the cell with the segment that was closest to being active, i.e. the segment that had the most input even though it was below threshold.
Let Ḋ_d_ij denote a binary matrix containing only the positive entries in D_d_ij .
We reinforce a segment where the following is true:

(5)

\forall_{j \in W^t} :\; \sum_i \pi^{t-1}_{ij} = 0 \;\text{and}\; \lVert \dot{D}^d_{ij} \circ A^{t-1} \rVert_1 = \max_{i,d} \left( \lVert \dot{D}^d_{ij} \circ A^{t-1} \rVert_1 \right)

689 :YAMAGUTIseisei:2019/07/28(日) 17:24:24.96 ID:WB6ziRYun
Reinforcing the above segments is straightforward: we wish to reward synapses with active presynaptic cells and punish synapses with inactive cells.
To do that we decrease all the permanence values by a small value p- and increase the permanence values corresponding to active presynaptic cells by a larger value p+ :

(6)

\Delta D^d_{ij} = p^{+} \left( \dot{D}^d_{ij} \circ A^{t-1} \right) - p^{-}\, \dot{D}^d_{ij}

The above rules deal with cells that are currently active.
We also apply a very small decay to active segments of cells that did not become active.
This can happen if segments were mistakenly reinforced by chance:

(7)

\Delta D^d_{ij} = p^{--}\, \dot{D}^d_{ij} \quad \text{where } a^t_{ij} = 0 \;\text{and}\; \lVert \tilde{D}^d_{ij} \circ A^{t-1} \rVert_1 > \theta

(Here p^{--} is a permanence decrement much smaller than p^{-}.)

The matrices ΔD_d_ij are added to the current matrices of permanence values at every time step.

Implementation details:
in our software implementation, we make some simplifying assumptions that greatly speed up simulation time for larger networks.
Instead of explicitly initializing a complete set of synapses across every segment and every cell, we greedily create segments on a random cell and initialize potential synapses on that segment by sampling from currently active cells.
This happens only when there is no match to any existing segment.
In our simulations N = 2048, M = 32, k = 40.
We typically connect between 20 and 40 synapses on a segment, and θ is around 15.
Permanence values vary from 0 to 1 with a connection threshold of 0.5.
p+ and p- are small values that are tuned based on the individual dataset but typically less than 0.1.
The full source code for the implementation is available on Github at http://github.com/numenta/nupic



690 :YAMAGUTIseisei:2019/07/28(日) 17:25:16.76 ID:WB6ziRYun
6.
REFERENCES

Ahmad, S., and Hawkins, J. (2015).
Properties of Sparse Distributed Representations and their Application to Hierarchical Temporal Memory.
arXiv:1503.07469 [q-bio.NC]

Antic, S. D., Zhou, W. L., Moore, A. R., Short, S. M., and Ikonomu, K. D. (2010).
The decade of the dendritic NMDA spike.
J. Neurosci. Res. 88, 2991-3001.
doi:10.1002/jnr.22444.

Billaudelle, S., and Ahmad, S. (2015).
Porting HTM Models to the Heidelberg Neuromorphic Computing Platform.
arXiv:1505.02142 [q-bio.NC]

Chklovskii, D. B., Mel, B. W., and Svoboda, K. (2004).
Cortical rewiring and information storage.
Nature 431, 782-8.
doi:10.1038/nature03012.

Cichon, J., and Gan, W.-B. (2015).
Branch-specific dendritic Ca2+ spikes cause persistent synaptic plasticity.
Nature 520, 180-5.

Cui, Y., Ahmad, S., Surpur, C., and Hawkins, J. (2015).
Maintaining stable perception during active exploration.
in Cosyne Abstracts (Salt Lake City).

Faisal, A. A., Selen, L. P. J., and Wolpert, D. M. (2008).
Noise in the nervous system.
Nat. Rev. Neurosci. 9, 292-303.

691 :YAMAGUTIseisei:2019/07/28(日) 17:26:06.42 ID:WB6ziRYun
Ghosh-Dastidar, S., and Adeli, H. (2009).
Spiking neural networks.
Int. J. Neural Syst.
19, 295-308.

Golding, N. L., Jung, H. Y., Mickus, T., and Spruston, N. (1999).
Dendritic calcium spike initiation and repolarization are controlled by distinct potassium channel subtypes in CA1 pyramidal neurons.
J. Neurosci. 19, 8789-98.

Hawkins, J., Ahmad, S., and Dubinsky, D. (2011).
Cortical learning algorithm and hierarchical temporal memory.
http://numenta.org/resources/HTM_CorticalLearningAlgorithms.pdf

Hawkins, J., and Blakeslee, S. (2004).
On Intelligence.
New York: Henry Holt and Company.

Hochreiter, S., and Schmidhuber, J. (1997).
Long Short-Term Memory.
Neural Comput. 9, 1735-1780.

Holtmaat, A., and Svoboda, K. (2009).
Experience-dependent structural synaptic plasticity in the mammalian brain.
Nat. Rev. Neurosci. 10, 647-58.

Hu, H., Gan, J., and Jonas, P. (2014).
Fast-spiking, parvalbumin+ GABAergic interneurons: From cellular design to microcircuit function.
Science 345, 1255263.

Kerchner, G. A., and Nicoll, R. A. (2008).
Silent synapses and the emergence of a postsynaptic mechanism for LTP.
Nat. Rev. Neurosci. 9, 813-25.

692 :YAMAGUTIseisei:2019/07/28(日) 17:27:08.93 ID:WB6ziRYun
Knott, G. W., Quairiaux, C., Genoud, C., and Welker, E. (2002).
Formation of dendritic spines with GABAergic synapses induced by whisker stimulation in adult mice.
Neuron 34, 265-73.

Lamme, V. A., Supèr, H., and Spekreijse, H. (1998).
Feedforward, horizontal, and feedback processing in the visual cortex.
Curr. Opin. Neurobiol. 8, 529-35.

Larkum, M. (2013).
A cellular mechanism for cortical associations: an organizing principle for the cerebral cortex.
Trends Neurosci. 36, 141-51.
doi:10.1016/j.tins.2012.11.006.

Larkum, M. E., and Nevian, T. (2008).
Synaptic clustering by dendritic signalling mechanisms.
Curr. Opin. Neurobiol. 18, 321-31.
doi:10.1016/j.conb.2008.08.013.

Larkum, M. E., Nevian, T., Sandler, M., Polsky, A., and Schiller, J. (2009).
Synaptic integration in tuft dendrites of layer 5 pyramidal neurons: a new unifying principle.
Science 325, 756-760.
doi:10.1126/science.1171958.

Larkum, M. E., Zhu, J. J., and Sakmann, B. (1999).
A new cellular mechanism for coupling inputs arriving at different cortical layers.
Nature 398, 338-41.



693 :YAMAGUTIseisei:2019/07/28(日) 17:28:04.79 ID:WB6ziRYun
Lashley, K. (1951).
“The problem of serial order in behavior,”
in Cerebral mechanisms in behavior, ed.
L. Jeffress (New York: Wiley), 112-131.

LeCun, Y., Bengio, Y., and Hinton, G. (2015).
Deep learning.
Nature 521, 436-444.

Losonczy, A., Makara, J. K., and Magee, J. C. (2008).
Compartmentalized dendritic plasticity and input feature storage in neurons.
Nature 452, 436-41.
doi:10.1038/nature06725.

Maass, W. (1997).
Networks of spiking neurons: the third generation of neural network models.
Neural networks 10, 1659-1671.
doi:10.1016/S0893-6080(97)00011-7.

Major, G., Larkum, M. E., and Schiller, J. (2013).
Active properties of neocortical pyramidal neuron dendrites.
Annu. Rev. Neurosci. 36, 1-24.
doi:10.1146/annurev-neuro-062111-150343.

Major, G., Polsky, A., Denk, W., Schiller, J., and Tank, D. W. (2008).
Spatiotemporally graded NMDA spike/plateau potentials in basal dendrites of neocortical pyramidal neurons.
J. Neurophysiol. 99, 2584-601.

Martin, K. A. C., and Schröder, S. (2013).
Functional heterogeneity in neighboring neurons of cat primary visual cortex in response to both artificial and natural stimuli.
J. Neurosci. 33, 7325-44.

694 :YAMAGUTIseisei:2019/07/28(日) 17:29:58.73 ID:WB6ziRYun
Megías, M., Emri, Z., Freund, T. F., and Gulyás, A. I. (2001).
Total number and distribution of inhibitory and excitatory synapses on hippocampal CA1 pyramidal cells.
Neuroscience 102, 527-40.

Niell, C. M., Meyer, M. P., and Smith, S. J. (2004).
In vivo imaging of synapse formation on a growing dendritic arbor.
Nat. Neurosci. 7, 254-60.

Nikolić, D., Häusler, S., Singer, W., and Maass, W. (2009).
Distributed fading memory for stimulus properties in the primary visual cortex.
PLoS Biol. 7, e1000260.

Olshausen, B. A., and Field, D. J. (2004).
Sparse coding of sensory inputs.
Curr. Opin. Neurobiol. 14, 481-487.
doi:10.1016/j.conb.2004.07.007.

Petreanu, L., Mao, T., Sternson, S. M., and Svoboda, K. (2009).
The subcellular organization of neocortical excitatory connections.
Nature 457, 1142-5.

Poirazi, P., Brannon, T., and Mel, B. W. (2003).
Pyramidal neuron as two-layer neural network.
Neuron 37, 989-99.

Polsky, A., Mel, B. W., and Schiller, J. (2004).
Computational subunits in thin dendrites of pyramidal cells.
Nat. Neurosci. 7, 621-7.
doi:10.1038/nn1253.

695 :YAMAGUTIseisei:2019/07/28(日) 17:30:55.67 ID:WB6ziRYun
Rabiner, L., and Juang, B. (1986).
An introduction to hidden Markov models.
IEEE ASSP Mag. 3, 4-16.

Rah, J.-C., Bas, E., Colonell, J., Mishchenko, Y., Karsh, B., Fetter, R. D., et al. (2013).
Thalamocortical input onto layer 5 pyramidal neurons measured using quantitative large-scale array tomography.
Front. Neural Circuits 7, 177.

Rao, R. P. N., and Sejnowski, T. J. (2000).
Predictive Sequence Learning in Recurrent Neocortical Circuits.
in Advances in Neural Information Processing Systems 12, 164-170.

Ruf, B., and Schmitt, M. (1997).
Learning temporally encoded patterns in networks of spiking neurons.
Neural Process. Lett., 9-18.
doi:10.1023/A:1009697008681.

Schiller, J., Major, G., Koester, H. J., and Schiller, Y. (2000).
NMDA spikes in basal dendrites of cortical pyramidal neurons.
Nature 404, 285-9.

Schiller, J., and Schiller, Y. (2001).
NMDA receptor-mediated dendritic spikes and coincident signal amplification.
Curr. Opin. Neurobiol. 11, 343-8.

Spruston, N. (2008).
Pyramidal neurons: dendritic structure and synaptic integration.
Nat. Rev. Neurosci. 9, 206-21.

696 :YAMAGUTIseisei:2019/07/28(日) 17:31:52.78 ID:WB6ziRYun
Stuart, G. J., and Häusser, M. (2001).
Dendritic coincidence detection of EPSPs and action potentials.
Nat. Neurosci. 4, 63-71.






Trachtenberg, J. T., Chen, B. E., Knott, G. W., Feng, G., Sanes, J. R., Welker, E., et al. (2002).
Long-term in vivo imaging of experience-dependent synaptic plasticity in adult cortex.
Nature 420, 788-94.

Vinje, W. E., and Gallant, J. L. (2002).
Natural Stimulation of the Nonclassical Receptive Field Increases Information Transmission Efficiency in V1.
J. Neurosci. 22, 2904-2915.

Yen, S.-C., Baker, J., and Gray, C. M. (2007).
Heterogeneity in the responses of adjacent neurons to natural stimuli in cat striate cortex.
J. Neurophysiol. 97, 1326-1341.
doi:10.1167/7.9.326.

Yoshimura, Y., Sato, H., Imamura, K., and Watanabe, Y. (2000).
Properties of horizontal and vertical inputs to pyramidal cells in the superficial layers of the cat visual cortex.
J. Neurosci. 20, 1931-40.



697 :YAMAGUTIseisei:2019/07/28(日) 17:33:05.17 ID:WB6ziRYun
[Figure 1 panels: A; B and C, each with inputs labeled Feedback, Context, and Feedforward]


Figure 1: Comparison of neuron models.
A)
The neuron model used in most artificial neural networks has few synapses and no dendrites.
B)
A neocortical pyramidal neuron has thousands of excitatory synapses located on dendrites (inset).
The co-activation of a set of synapses on a dendritic segment will cause an NMDA spike and depolarization at the soma.
There are three sources of input to the cell.
The feedforward inputs (shown in green) which form synapses proximal to the soma, directly lead to action potentials.
NMDA spikes generated in the more distal basal and apical dendrites depolarize the soma but typically not sufficiently to generate a somatic action potential.
C)
An HTM model neuron models dendrites and NMDA spikes with an array of coincident detectors each with a set of synapses (only a few of each are shown).



698 :YAMAGUTIseisei:2019/07/28(日) 17:36:19.22 ID:WB6ziRYun
[Figure 2 panels: A, “Cellular layers learn sequences” (layers 2/3, 4, 5, 6); B, “Before learning” (sequences A B C D and X B C Y); C, “After learning” (A B' C' D' and X B'' C'' Y''), same columns but only one cell active per column]


Figure 2: Representing sequences in cortical cellular layers.
A)
The neocortex is divided into cellular layers. The panels in this figure show part of one generic cellular layer.
For clarity, the panels only show 21 mini-columns with 6 cells per column.
B)
Input sequences ABCD and XBCY are not yet learned.
Each sequence element invokes a sparse set of mini-columns, only three in this illustration.
All the cells in a mini-column become active if the input is unexpected, which is the case prior to learning the sequences.
C)
After learning the two sequences, the inputs invoke the same mini-columns but only one cell is active in each column, labeled B’, B’’, C’, C’’, D’ and Y’’.
Because C’ and C’’ are unique, they can invoke the correct high-order prediction of either Y or D.

699 :YAMAGUTIseisei:2019/07/28(日) 17:39:19.04 ID:WB6ziRYun




[Figure 3 panels: A, “Prediction of next input” (A - input, B' - predicted, B - input, C' - predicted); B, “Multiple simultaneous predictions” (B - input, C' and C'' - predicted, C - input, D' and Y'' - predicted)]


Figure 3: Basal connections to nearby neurons predict the next input.
A)
Using one of the sequences from Fig. 2, both active cells (black) and depolarized/predicted cells (red) are shown. The first panel shows the unexpected input A, which leads to a prediction of the next input B’ (second panel).
If the subsequent input matches the prediction then only the depolarized cells will become active (third panel), which leads to a new prediction (fourth panel).
The lateral synaptic connections used by one of the predicted cells are shown in the rightmost panel.
In a realistic network every predicted cell would have 15 or more connections to a subset of a large population of active cells.
B)
Ambiguous sub-sequence “BC” (which is part of both ABCD and XBCY) is presented to the network. The first panel shows the unexpected input B, which leads to a prediction of both C’ and C’’.
The third panel shows the system after input C. Both sets of predicted cells become active, which leads to predicting both D and Y (fourth panel).
In complex data streams there are typically many simultaneous predictions.

700 :YAMAGUTIseisei:2019/07/28(日) 17:40:42.15 ID:WB6ziRYun




[Figure 4 panels: apical dendrites carrying feedback biases for sequence B' C' D'; input C producing representation C'; input Y, which does not match expectation]


Figure 4: Feedback to apical dendrites predicts entire sequences.
This figure uses the same network and representations as Fig. 2.
Area labeled “apical dendrites” is equivalent to layer 1 in neocortex; the apical dendrites (not shown) from all the cells terminate here.
In the figure, the following assumptions have been made.
The network has previously learned the sequence ABCD as was illustrated in Fig. 2.
A constant feedback pattern was presented to the apical dendrites during the learned sequence, and the cells that participate in the sequence B’C’D’ have formed synapses on their apical dendrites to recognize the constant feedback pattern.

After the feedback connections have been learned, presentation of the feedback pattern to the apical dendrites is simultaneously recognized by all the cells that would be active sequentially in the sequence.
These cells, shown in red, become depolarized (left pane).
When a new feedforward input arrives it will lead to the sparse representation relevant to the predicted sequence (middle panel).
If a feedforward pattern cannot be interpreted as part of the expected sequence (right panel) then all cells in the selected columns become active indicative of an anomaly.
In this manner apical feedback biases the network to interpret any input as part of an expected sequence and detects if an input does not match any one of the elements in the expected sequence.

701 :YAMAGUTIseisei:2019/07/28(日) 17:41:56.63 ID:WB6ziRYun




[Figure 5: dendrite and axon schematic; synapse “permanence” axis with ticks at 0.0, 0.3, and 1.0, mapped to a binary synapse weight of 0 or 1]

Figure 5: Learning by growing new synapses.
Learning in an HTM neuron is modeled by the growth of new synapses from a set of potential synapses.
A “permanence” value is assigned to each potential synapse and represents the growth of the synapse.
Learning occurs by incrementing or decrementing permanence values.
The synapse weight is a binary value set to 1 if the permanence is above a threshold.



702 :YAMAGUTIseisei:2019/07/28(日) 17:44:06.35 ID:WB6ziRYun
[Figure 6 panels: A, prediction accuracy (0-60%) vs. sequence elements (0-6000), comparing a first-order model with the HTM layer; B, accuracy vs. sequence elements with 40%, 50%, 60%, and 75% of cells inactivated]

703 :YAMAGUTIseisei:2019/07/28(日) 17:47:35.05 ID:WB6ziRYun
Figure 6: Simulation results of the sequence memory network.
The input stream used for this figure contained high-order sequences mixed with random elements.
The maximum possible average prediction accuracy of this data stream is 50%.
A)
High-order on-line learning.
The red line shows the network learning and achieving maximum possible performance after about 2500 sequence elements.
At element 3000 the sequences in the data stream were changed.
Prediction accuracy drops and then recovers as the model learns the new temporal structure.
For comparison, the lower performance of a first-order network is shown in blue.
B)
Robustness of the network to damage.
After the network reached stable performance we inactivated a random selection of neurons.
At up to 40% cell death there is almost no impact on performance.
At greater than 40% cell death the performance of the network declines but then recovers as the network relearns using remaining neurons.






S1 Text.

Chance of Error When Recognizing Large Patterns with a Few Synapses

Formula for calculating chance of error

A non-linear dendritic segment can robustly classify a pattern by sub-sampling (forming synapses to) a small number of cells from a large population.
Assuming a random distribution of patterns, the exact probability of a false match is given by the following equation:

704 :YAMAGUTIseisei:2019/08/03(土) 19:53:45.72 ID:ahZr93IIX
P(\text{false match}) = \frac{ \sum_{b=\theta}^{s} \binom{s}{b} \binom{n-s}{a-b} }{ \binom{n}{a} }

n = cell population size
a = number of active cells
s = number of synapses on segment
θ = NMDA spike threshold


Table A:
Chance of error due to sub-sampling

This table demonstrates the effect of sub-sampling on the probability of a false match using the above equation.
The chance of an error drops rapidly as the sampling size increases.
A small number of synapses is sufficient for reliable matching.


n = 200,000; a = 2,000; θ = s

s    Probability of false match
6    9.9 × 10^-13
8    9.8 × 10^-17
10   9.8 × 10^-21

705 :YAMAGUTIseisei:2019/08/03(土) 20:00:24.50 ID:ahZr93IIX
Table B:
Chance of error with addition of 50% noise immunity

This table demonstrates robustness to noise.
By forming more synapses than required for an NMDA spike, a neuron can be robust to large amounts of noise and pattern variation and still have low probability of a false match.
For example, with s = 2θ the system will be immune to 50% noise.
The chance of an error drops rapidly as s increases; even with noise a small number of synapses is sufficient for reliable matching.


θ    s    Probability of false match
6    12   8.7 × 10^-10
8    16   1.2 × 10^-12
10   20   1.6 × 10^-15
12   24   2.3 × 10^-18

n = 200,000; a = 2,000

Table C:
Chance of error with addition of mixing synapses on a dendritic segment

This table demonstrates that mixing synapses for m different patterns on a single dendritic segment will still not cause unacceptable errors.
By setting s = 2mθ we can see how a segment can recognize m independent patterns and still be robust to 50% noise.
It is possible to get very high accuracy with larger m by using a slightly higher threshold.

706 :YAMAGUTIseisei:2019/08/03(土) 20:02:04.74 ID:ahZr93IIX
θ    m    s     Probability of false match
10   2    40    6.3 × 10^-12
10   4    80    8.5 × 10^-9
10   6    120   4.2 × 10^-7
15   6    120   1.7 × 10^-12

n = 200,000; a = 2,000



707 :YAMAGUTIseisei:2019/08/03(土) 20:02:37.21 ID:ahZr93IIX
Property                                  HTM   HMMs     LSTM
High order sequences                      Yes   Limited  Yes
Discovers high order sequence structure   Yes   No       Yes
Local learning rules                      Yes   No       No*
Continuous learning                       Yes   No       No
Multiple simultaneous predictions         Yes   No       No
Unsupervised learning                     Yes   Yes      No

708 :YAMAGUTIseisei:2019/08/03(土) 20:03:23.00 ID:ahZr93IIX
Property                                  HTM        HMMs  LSTM
Robustness and fault tolerance            Very high  No    Yes
Detailed mapping to neuroscience          Yes        No    No
Probabilistic model                       No         Yes   No


S1 Table. Comparison of common sequence memory algorithms: the table above compares two common sequence memory algorithms (HMM and LSTM) to the proposed model (HTM).
* Although weight update rules are local, LSTMs require computing a global error signal that is then back propagated.


709 :YAMAGUTIseisei:2019/08/11(日) 18:01:19.35 ID:TqTByUJCx
Universal Transformers http://arxiv-vanity.com/papers/1807.03819/

713 :YAMAGUTIseisei:2019/08/11(日) 18:06:34.23 ID:TqTByUJCx
Figure 4: The Universal Transformer with position and step embeddings as well as dropout and layer normalization.
Appendix B
bAbI Detailed Results
Best seed run for each task (out of 10 runs)
Task id | 10K train single | 10K train joint | 1K train single | 1K train joint
1 0.0 0.0 0.0 0.0
2 0.0 0.0 0.0 0.5
3 0.4 1.2 3.7 5.4
4 0.0 0.0 0.0 0.0
5 0.0 0.0 0.0 0.5
6 0.0 0.0 0.0 0.5
7 0.0 0.0 0.0 3.2
8 0.0 0.0 0.0 1.6
9 0.0 0.0 0.0 0.2
10 0.0 0.0 0.0 0.4
11 0.0 0.0 0.0 0.1
12 0.0 0.0 0.0 0.0
13 0.0 0.0 0.0 0.6
14 0.0 0.0 0.0 3.8
15 0.0 0.0 0.0 5.9
16 0.4 1.2 5.8 15.4
17 0.6 0.2 32.1 43.2
18 0.0 0.0 0.0 4.1
19 2.8 3.1 47.2 69.11
20 0.0 0.0 2.4 2.4
avg err 0.21 0.29 4.56 7.85
failed 0 0 3 5
Average (±var) over all seeds (for 10 runs)

714 :YAMAGUTIseisei:2019/08/11(日) 18:07:04.04 ID:TqTByUJCx
Task id | 10K train single | 10K train joint | 1K train single | 1K train joint
1 0.0 ±0.0 0.0 ±0.0 0.2 ±0.3 0.1 ±0.2
2 0.2 ±0.4 1.7 ±2.6 3.2 ±4.1 4.3 ±11.6
3 1.8 ±1.8 4.6 ±7.3 9.1 ±12.7 14.3 ±18.1
4 0.1 ±0.1 0.2 ±0.1 0.3 ±0.3 0.4 ±0.6
5 0.2 ±0.3 0.8 ±0.5 1.1 ±1.3 4.3 ±5.6
6 0.1 ±0.2 0.1 ±0.2 1.2 ±2.1 0.8 ±0.4
7 0.3 ±0.5 1.1 ±1.5 0.0 ±0.0 4.1 ±2.9
8 0.3 ±0.2 0.5 ±1.1 0.1 ±0.2 3.9 ±4.2
9 0.0 ±0.0 0.0 ±0.0 0.1 ±0.1 0.3 ±0.3
10 0.1 ±0.2 0.5 ±0.4 0.7 ±0.8 1.3 ±1.6
11 0.0 ±0.0 0.1 ±0.1 0.4 ±0.8 0.3 ±0.9
12 0.2 ±0.1 0.4 ±0.4 0.6 ±0.9 0.3 ±0.4
13 0.2 ±0.5 0.3 ±0.4 0.8 ±0.9 1.1 ±0.9
14 1.8 ±2.6 1.3 ±1.6 0.1 ±0.2 4.7 ±5.2
15 2.1 ±3.4 1.6 ±2.8 0.3 ±0.5 10.3 ±8.6
16 1.9 ±2.2 0.9 ±1.3 9.1 ±8.1 34.1 ±22.8
17 1.6 ±0.8 1.4 ±3.4 44.7 ±16.6 51.1 ±12.3
18 0.3 ±0.4 0.7 ±1.4 2.3 ±3.6 12.8 ±9.0
19 3.4 ±4.0 6.1 ±7.3 50.2 ±8.4 73.1 ±23.9
20 0.0 ±0.0 0.0 ±0.0 3.2 ±2.5 2.6 ±2.8
avg 0.73 ±0.89 1.12 ±1.62 6.39 ±3.22 11.21 ±6.62

715 :YAMAGUTIseisei:2019/08/11(日) 18:07:28.73 ID:TqTByUJCx
Appendix C
bAbI Attention Visualization
We present visualization of the attention distributions on bAbI tasks for a couple of examples.
The attention weights are visualized over different time steps, for each head, across all the facts in the story and the question.
Different color bars on the left side indicate attention weights based on different heads (4 heads in total).
An example from task 1 (requiring one supportive fact to solve):
Story:
John travelled to the hallway.
Mary journeyed to the bathroom.
Daniel went back to the bathroom.
John moved to the bedroom.
Query:
Where is Mary?
Model’s output:
bathroom
(a) Step 1 (b) Step 2 (c) Step 3 (d) Step 4

716 :YAMAGUTIseisei:2019/08/11(日) 18:08:17.27 ID:TqTByUJCx
Figure 5:
Visualization of the attention distributions, when encoding the question: “Where is Mary?”.
An example from task 2 (requiring two supportive facts to solve):
Story:
Sandra journeyed to the hallway.
Mary went to the bathroom.
Mary took the apple there.
Mary dropped the apple.
Query:
Where is the apple?
Model’s output:
bathroom
(a) Step 1 (b) Step 2 (c) Step 3 (d) Step 4

Figure 6:
Visualization of the attention distributions, when encoding the question: “Where is the apple?”.
An example from task 2 (requiring two supportive facts to solve):
Story:
John went to the hallway.
John went back to the bathroom.
John grabbed the milk there.
Sandra went back to the office.
Sandra journeyed to the kitchen.
Sandra got the apple there.
Sandra dropped the apple there.
John dropped the milk.
Query:
Where is the milk?
Model’s output:
bathroom
(a) Step 1 (b) Step 2 (c) Step 3 (d) Step 4

717 :YAMAGUTIseisei:2019/08/11(日) 18:08:49.22 ID:TqTByUJCx
Figure 7:
Visualization of the attention distributions, when encoding the question: “Where is the milk?”.
An example from task 3 (requiring three supportive facts to solve):
Story:
Mary got the milk.
John moved to the bedroom.
Daniel journeyed to the office.
John grabbed the apple there.
John got the football.
John journeyed to the garden.
Mary left the milk.
John left the football.
Daniel moved to the garden.
Daniel grabbed the football.
Mary moved to the hallway.
Mary went to the kitchen.
John put down the apple there.
John picked up the apple.
Sandra moved to the hallway.
Daniel left the football there.
Daniel took the football.
John travelled to the kitchen.
Daniel dropped the football.
John dropped the apple.
John grabbed the apple.
John went to the office.
Sandra went back to the bedroom.
Sandra took the milk.
John journeyed to the bathroom.
John travelled to the office.
Sandra left the milk.
Mary went to the bedroom.

718 :YAMAGUTIseisei:2019/08/11(日) 18:09:46.92 ID:TqTByUJCx
Mary moved to the office.
John travelled to the hallway.
Sandra moved to the garden.
Mary moved to the kitchen.
Daniel took the football.
Mary journeyed to the bedroom.
Mary grabbed the milk there.
Mary discarded the milk.
John went to the garden.
John discarded the apple there.
Query:
Where was the apple before the bathroom?
Model’s output:
office
(a) Step 1 (b) Step 2 (c) Step 3 (d) Step 4

Figure 8:
Visualization of the attention distributions, when encoding the question: “Where was the apple before the bathroom?”.


721 :YAMAGUTIseisei:2019/12/01(日) 01:14:16.63 ID:jGtwQP38C
http://www.expedient.com/blog/what-are-the-differences-between-backups-and-disaster-recovery/
What Are the Differences Between Backups and Disaster Recovery?

Author:
Erin Masterson
Category:
Disaster Recovery, Data Centers, Infrastructure Availability

Disaster recovery planning is an integral part of any business’s IT strategy, and is becoming more prevalent as security breaches and network outages have become common threats, and the cost of downtime has steadily increased.

In the beginning stages of disaster recovery planning, decision makers are often mistaken about what constitutes a disaster recovery plan.
Many times they are misled by the idea that data backup is sufficient precaution in the event of a disaster.

“Customers often come to us seeking disaster recovery services without realizing that simply backing up their data is not enough,” says Joe Palian, Regional Account Executive at Expedient.

While having a backup strategy is important, it is not the same as a disaster recovery strategy; rather, it is only the beginning stage of establishing a proper DR plan.
A backup is a copy of your data; a disaster recovery plan is insurance that guarantees its recovery.

So, what makes backups and disaster recovery different?

722 :YAMAGUTIseisei:2019/12/01(日) 01:14:54.79 ID:jGtwQP38C
1.)
Data retention requirements
Backups are typically performed on a daily basis to ensure necessary data retention at a single location, for the single purpose of copying data.

Disaster recovery requires the determination of the RTO (recovery time objective) in order to designate the maximum amount of time the business can be without IT systems post-disaster.
Traditionally, the ability to meet a given RTO requires at least one duplicate of the IT infrastructure in a secondary location to allow for replication between the production and DR site.
2.)
Recovery ability
Disaster recovery is the process of failing over your primary environment to an alternate environment that is capable of sustaining your business continuity.

Backups are useful for immediate access when you need to restore a document, but they do not facilitate the failover of your total environment should your infrastructure become compromised.
They also do not include the physical resources required to bring them online.

723 :YAMAGUTIseisei:2019/12/01(日) 01:15:31.35 ID:jGtwQP38C
3.)
Additional resource needs
A backup is simply a copy of data intended to be restored to the original source.

DR requires a separate production environment where the data can live.
All aspects of the current environment should be considered, including physical resources, software, connectivity and security.
4.)
Planning process
Planning a backup routine is relatively simple, since typically the only goals are to meet the RPO (recovery point objective) and data retention requirements.

A complete disaster recovery strategy requires additional planning, including determining which systems are considered mission critical, creating a recovery order and communication process, and most importantly, a way to perform a valid test.

The overall benefits and importance of a DR plan are to mitigate risk and downtime, maintain compliance and avoid outages.
Backups serve a simpler purpose.
Make sure you know which solution makes sense for your business needs.

Looking to improve your DR preparedness? Follow these 6 Steps.

Have any questions for Erin Masterson?

724 :YAMAGUTIseisei:2019/12/01(日) 01:16:00.97 ID:jGtwQP38C
the cost of downtime
https://thecloudcalculator.com/calculators/cost-of-downtime/
often mistaken about what constitutes a disaster recovery plan
https://www.expedient.com/blog/the-differences-between-backups-and-disaster-recovery/
a disaster recovery strategy
https://www.expedient.com/blog/what-steps-have-you-left-out-of-your-dr-strategy/
Disaster recovery
https://www.expedient.com/services/managed-services/disaster-recovery/
the failover of your total environment
https://www.expedient.com/blog/expedient-push-button-dr-zertocon2018/
a way to perform a valid test
https://www.expedient.com/blog/with-push-button-dr-disaster-recovery-testing-doesnt-have-to-be-a-four-letter-word/
Follow these 6 Steps
http://bit.ly/1SFm5yp

IT Disaster Recovery Planning: What Hurricane Season Can Teach Los Angeles Companies
http://sugarshot.io/it-disaster-recovery-planning-what-hurricane-season-can-teach-los-angeles-companies/

725 :YAMAGUTIseisei:2019/12/08(日) 21:27:02.54 ID:a72oztLlg

>>724
http://j.mp/1SFm5yp+# http://go.expedient.com/l/12902/2016-04-01/2c35f5/12902/127088/6_Steps_to_DR_Preparedness.pdf

726 :YAMAGUTIseisei:2020/03/18(水) 17:32:46.47 ID:jXSDxkt5c
http://pnas.org/content/early/2020/01/07/1910837117/# pnas.org/content/117/4/1853

A scalable pipeline for designing reconfigurable organisms
View ORCID ProfileSam Kriegman, Douglas Blackiston, Michael Levin, and Josh Bongard
PNAS first published January 13, 2020 http://doi.org/10.1073/pnas.1910837117

Sam Kriegman
aDepartment of Computer Science, University of Vermont, Burlington, VT 05405;

Douglas Blackiston
bDepartment of Biology, Tufts University, Medford, MA 02153;cAllen Discovery Center, Tufts University, Medford, MA 02153;

Michael Levin
bDepartment of Biology, Tufts University, Medford, MA 02153;cAllen Discovery Center, Tufts University, Medford, MA 02153;dWyss Institute for Biologically Inspired Engineering, Harvard University, Boston, MA 02115

Josh Bongard
aDepartment of Computer Science, University of Vermont, Burlington, VT 05405;


Phase offsets stored in the genotype were mutated by adding a number that was drawn randomly from a normal distribution with mean zero and SD s = 0.4π.

Also, contractile tissue incurs a much higher metabolic cost compared to nonmuscle tissue (the human heart consumes ∼1 mM ATP per second; ref. 31).

727 :YAMAGUTIseisei:2020/04/12(日) 14:34:40.36 ID:juW0pBg5d
Memory & Cognition
1974, Vol. 2, No. 3, 467-471


The influence of one memory retrieval on a subsequent memory retrieval *

GEOFFREY R. LOFTUS and ELIZABETH F. LOFTUS
University of Washington, Seattle, Washington 98195

*Requests for reprints may be sent to either Loftus, Department of Psychology, University of Washington, Seattle, Washington 98195.
The research was supported by a National Institute of Mental Health grant to E. Loftus and a National Science Foundation grant to G. Loftus.
Appreciation is expressed to Thomas 0. Nelson for his comments on the manuscript.


Ss produced an instance of a category and following zero or two intervening items produced a second instance of the same category.
The second instance was produced more quickly than the initial instance.
This finding, in conjunction with other data reported in the paper, indicates that the reduction in latency for the second instance is due mostly to a reduction in the rate at which the category is searched.

728 :YAMAGUTIseisei:2020/04/12(日) 14:35:58.23 ID:juW0pBg5d
In an experiment by Freedman and Loftus (1971), Ss were shown a noun category plus a restricting letter or adjective and were asked to name an instance of the category which began with the letter or which was characterized by the adjective.
Reaction time to produce the response was measured.
The data were discussed in terms of a model that postulated a hierarchical memory composed of noun categories (e.g., animals) with subsets (e.g., birds, dogs) and supersets (e.g., living things) of each category.
Retrieval from this hierarchical structure was assumed to consist of at least two major steps: (1) entering the appropriate category and (2) searching the category for an appropriate member.
The times to execute Step 1 and Step 2 are hereafter denoted t1 and t2, respectively.
The duration of t1 was estimated to be about .25 sec by the following reasoning.
Ss saw stimuli presented with the category either first (e.g., fruit-P) or second (e.g., P-fruit) and with at least a 1/2-sec interval between the noun and restrictor.
Reaction times were measured from the presentation of the second member of the pair.
When the category came second, the total retrieval process began only after its presentation and included both t1 and t2, according to the model.
When the category came first, however, t1 could be completed before the restrictor was shown.
For example, given the stimulus fruit-P, the S could enter the category “fruits” during the interval.
Since measured reaction time begins when “P” is presented, measured reaction time excludes t1 in this case.
The decrease in reaction time when the category is shown first vs second can therefore be equated with t1, which is excluded in the former case and included in the latter.

729 :YAMAGUTIseisei:2020/04/12(日) 14:46:46.31 ID:juW0pBg5d
More recently, Loftus (1973) asked Ss to produce a member of a category and a short time later asked them to produce a different member of that category.
This was accomplished by showing a category-letter pair (e.g., fruit-P), which asked the S for an appropriate instance, then, following zero, one, or two intervening items, showing the same category paired with a different letter (e.g., fruit-A), which asked for a different instance.
Interest centered around the question of whether the speed of retrieving the second instance of a category was affected by the retrieval of the first instance and/or the lag between the two retrievals.
The results indicated that response latency for the second instance was shorter than response latency for the first instance and increased monotonically with the number of intervening items.
For example, a S’s baseline time to name a fruit beginning with the letter “P” was 1.52 sec.
However, it took him 1.22 sec to produce the same response if he had named a different fruit on the previous trial and 1.29 sec to produce the response if he had named a different fruit two trials back.

The results of the Loftus (1973) study thus indicate that the process of retrieving information from a category facilitates a subsequent retrieval from that category.
However, in this experiment the S was presented with the category name and restricting letter simultaneously; retrieval time thus included both t1 and t2.
Consequently, the facilitation effect could have involved a reduction in t1 or t2 or both.
The present experiment is designed to distinguish among these three possibilities.

730 :YAMAGUTIseisei:2020/04/12(日) 15:03:05.89 ID:juW0pBg5d

In some conditions of the present experiment, an interval was inserted between the category name and the letter and the stimuli were presented either in the order category-letter or in the order letter-category
[as in the Freedman & Loftus (1971) study].
As noted above, this procedure allows an estimation of t1 .
Additionally in the present experiment, the S was required to name an instance of a category and shortly thereafter was asked to name a second instance of the category [as in the Loftus (1973) study].
This design is sufficient to determine the locus of the reduction in reaction time to name a second category instance.

Figure 1 shows three possible patterns of results.
Suppose first that only category entry time, t1, is reduced when a second category instance is produced.
In this case, the results shown in Fig.1a should obtain: the letter-category conditions (which include t1) should depend on the prior retrieval, whereas the category-letter conditions (which exclude t1) should not.





731 :YAMAGUTIseisei:2020/04/12(日) 15:04:33.20 ID:juW0pBg5d
[Figure 1 panels a, b, c: RT vs. lag (0, 2, initial); each panel shows letter-category and category-letter curves]


Fig.1.
Three possible patterns of results for the relationship between time and the number of intervening items (lag) between two appearances of a critical category.

732 :YAMAGUTIseisei:2020/04/12(日) 15:05:17.20 ID:juW0pBg5d
Conversely, suppose that only category search time, t2, is reduced when the second category instance is produced.
Such a situation would lead to the results shown in Fig.1b.
Both the category-letter and the letter-category conditions include t2, so they should be affected equally by the initial retrieval.

The final possibility is that both t1 and t2 are reduced.
This situation would predict the results shown in Fig.1c.
Here, the category-letter condition (which includes t2 but not t1) should be affected by the initial retrieval, but the letter-category condition (which includes both t1 and t2) should be affected to a greater degree.

METHOD

Subjects
Eighteen Ss from the New School for Social Research received $5 for their participation in two 1-h sessions, which occurred on 2 consecutive days.
No S had previously participated in a memory experiment.

Materials
Each stimulus was printed in block letters on a 5 x 8 in. index card.
A stimulus always consisted of a category name plus a letter (e.g., fruit-P).
Eighty critical category names were selected from the Battig and Montague (1969) and Shapiro and Palermo (1970) category norms.
Each of the category names was paired with two different letters.
If “dominance” is defined as the frequency with which a word is given as an exemplar of a category, then one of the two category-letter stimuli will be referred to as more dominant than the other.

In addition to the 160 critical stimuli (80 categories each paired with two letters), 80 filler stimuli were used.
The filler stimuli also consisted of a category plus a letter.
Some of the filler categories were used only once; others appeared twice with two different letters.
Thus, each S saw 240 unique stimuli (80 critical categories, each paired with two letters, plus 80 filler stimuli).

733 :YAMAGUTIseisei:2020/04/12(日) 15:05:56.33 ID:juW0pBg5d
Design
There were three within-S factors: order (category-letter vs letter-category), interval (simultaneous presentation of the stimuli vs 2.5-sec interval between the category name and the letter), and lag (Lag 0, Lag 2, and initial presentation).
These factors were combined factorially, thereby giving a 2 (orders) by 2 (intervals) by 3 (lags) by 18 (Ss) design.

Each S received a different permutation of the 240 items with the following restrictions:

(1) The initial presentation of a critical category-letter pair was followed after zero or two intervening filler items (i.e., at Lag 0 or at Lag 2) by the presentation of the same category paired with a different letter.
Each S received 40 stimuli presented at Lag 0 and 40 at Lag 2.

(2) On half of the trials, Ss saw the stimulus corresponding to the high dominant instance before seeing the stimulus corresponding to the low dominant instance.
For the remaining trials, the reverse arrangement held.
A given category was presented in the order dominant-nondominant for half the Ss and in the reverse order for the remaining half of the Ss.

734 :YAMAGUTIseisei:2020/04/12(日) 15:06:41.90 ID:juW0pBg5d
Procedure
Each S was told that he would see items consisting of categories and letters and that he was to respond with a word in the category that began with the given letter.
He was given examples and told to respond as quickly as possible, but to avoid errors.

The S sat in front of a screen with a window covered by half-silvered glass.
An index card containing the stimulus was placed in a dark enclosure behind the mirror and was presented by illuminating the enclosure.
A microphone was placed in front of the S, and he responded by speaking into it.

A trial consisted of the following:
(a) a card with the item printed in large type was placed in the darkened enclosure;
(b) the E said “ready” and pressed a button which illuminated the first member of the stimulus pair;
(c) either simultaneously or after a 2.5-sec interval, the second member of the pair was automatically illuminated and an electric timer started;
(d) the S’s verbal response activated a voice key that stopped the timer and terminated the trial.
A warm-up period of 20 trials preceded the experimental trials each day.

735 :YAMAGUTIseisei:2020/04/12(日) 15:07:21.47 ID:juW0pBg5d
RESULTS

Only correct responses (96%) to the critical stimuli were included in the following analyses.
Median latencies were obtained for each S’s responses in each of the 12 conditions.
For each condition, mean latencies were then obtained by averaging the medians from individual Ss; these means are plotted in Figs. 2 and 3.
Figure 2 shows the results when the 2.5-sec interval was inserted between the category and the letter.
In both the letter-category and category-letter conditions, a second instance of a category is produced faster than the first instance; furthermore, a second instance is produced faster at Lag 0 than at Lag 2.
Figure 3 indicates that the same pattern of results obtains when letter and noun are presented simultaneously.

A 2 (orders) by 2 (intervals) by 3 (lags) analysis of variance was done on the latency data.
Significant effects were found for lag [F(2,34) = 6.57, p < .05], category-letter order [F(1,17) = 14.71, p < .01], and interval [F(1,17) = 33.52, p < .01].






None of the two-way or three-way interactions was significant (F < 1 for all cases).

736 :YAMAGUTIseisei:2020/04/12(日) 15:11:11.69 ID:juW0pBg5d
DISCUSSION

Dependence of Memory Retrievals
A number of studies have indicated that the time to retrieve information from a semantic category is decreased if that category has been accessed a short time previously.
Collins and Quillian (1970), for example, have shown that the time required to answer such questions as “Is a canary a bird?” is decreased by as much as 600 msec if information about canaries has been accessed on the previous trial.
Using a somewhat different paradigm, Meyer and Schvaneveldt (Meyer & Schvaneveldt, 1971; Meyer, Schvaneveldt, & Ruddy, 1972; Schvaneveldt & Meyer, 1973; Meyer, 1973) have shown the same thing.
In these experiments, Ss were required to classify letter strings as words or nonwords.
The general finding was that the reaction time to classify a letter string as a word is faster if the S has just classified a semantically similar word as opposed to a semantically dissimilar word.
Thus, for example, the time it takes to classify “butter” as a word is faster if “butter” is preceded by “bread” than if it is preceded by “nurse.”

Two general classes of models have been proposed to handle such results.
A location shifting model (Meyer & Schvaneveldt, 1971) assumes that when a S has finished processing a member of a particular category and must then shift to begin processing a second category, the shift time is dependent upon the semantic distance between the two categories.
An activation model, on the other hand, assumes that when items in a category are processed, other items are “excited” or “activated” to the extent that they are semantically similar to the information being processed.
Two further assumptions are made: first (Warren, 1970) that activation decays away over time and second that activated items are more readily accessible than nonactivated items.


737 :YAMAGUTIseisei:2020/04/12(日) 15:13:05.83 ID:juW0pBg5d
2.5 sec. inlerval

RT

1.90
: * Letter - Category
: * Category - Letter
1.60
1.50

0 2 Initial
LAG

Fig. 2.
Mean reaction time in seconds as a function of the number of intervening items (lag) between two appearances of a critical category. Items were presented with a 2.5-sec interval between the category and the letter.


simult.

RT

2.20
: * Letter - Category
: * Category - Letter
1.90
1.80

0 2 Initial
LAG

Fig. 3.
Mean reaction time in seconds as a function of the number of intervening items (lag) between two appearances of a critical category. The category and letter were presented simultaneously.

738 :YAMAGUTIseisei:2020/04/12(日) 15:13:51.71 ID:juW0pBg5d
The results of the present experiment together with the data of Meyer et al (1972) and Loftus (1973) disconfirm the location shifting model and support the activation model.
All of these experiments involve the following sorts of comparisons.
Let T represent target information whose time to be processed is the dependent variable of interest.
Let R represent information which is semantically related to T, and finally let U1 and U2 represent information which is semantically unrelated to T.
Now consider three conditions:

Condition a: Process U1 ; Process U2 ; Process T.
Condition b: Process R ; Process U2 ; Process T.
Condition c: Process U1 ; Process R ; Process T.

The data show that T is processed fastest in Condition c, next fastest in Condition b, and slowest in Condition a.
Both the location shifting model and the activation model correctly predict that reaction time in Condition c would be faster than reaction time in Conditions a and b.
However, the predictions of the two models differ with regard to the relationship between Conditions a and b.
A location shifting model incorrectly predicts that reaction time would be the same for Conditions a and b, since in both cases the S is shifting from the unrelated category U2 to T.
An activation model, on the other hand, correctly predicts the obtained pattern of results.
This is because in Condition b, T is assumed to have been activated by R, and this activation has not decayed by the time T is processed.
In Condition a, on the other hand, T is not assumed to have been activated at all; therefore, time to process T would be longer.
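
To make the activation account concrete, here is a minimal numerical sketch in which processing a related item adds activation to T that decays exponentially and retrieval time is shortened by whatever activation survives; all constants and names are illustrative assumptions, not values from the paper:

    # Toy activation model: processing an item activates semantically
    # related items, activation decays over time, and activated items
    # are retrieved faster. All constants are illustrative assumptions.
    import math

    BASE_RT = 1.0      # hypothetical baseline time to process T (sec)
    BOOST = 0.3        # activation a related item confers on T
    DECAY_RATE = 0.2   # exponential decay constant of activation

    def rt_for_target(episodes, target_time=2.0):
        # episodes: (time, is_related_to_T) pairs processed before T
        activation = 0.0
        for t, related in episodes:
            if related:
                activation += BOOST * math.exp(-DECAY_RATE * (target_time - t))
        return BASE_RT - activation

    # Condition a: U1 then U2 (neither related to T)
    # Condition b: R then U2 (related item processed early)
    # Condition c: U1 then R (related item processed recently)
    for name, eps in [("a", [(0, False), (1, False)]),
                      ("b", [(0, True), (1, False)]),
                      ("c", [(0, False), (1, True)])]:
        print(name, round(rt_for_target(eps), 3))
    # Prints RT(c) < RT(b) < RT(a), the ordering observed in the data.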

Processing Stages
At the outset of this report, it was noted that the semantic retrieval model proposed by Freedman and Loftus (1971) postulates two major processing stages: entering a category (which takes time t1) and searching the category (which takes time t2).

739 :YAMAGUTIseisei:2020/04/12(日) 15:14:42.33 ID:juW0pBg5d


Table 1
Time Estimates (in Seconds) for Memory Retrieval Stages as a Function of Three Lag Conditions

Retrieval Stage                              Lag 0   Lag 2   Initial
t1      Category entry time                  0.20    0.22    0.27
t2 + k  Category search time plus baseline   1.47    1.65    1.69
t3      Eye movement time                    0.14    0.14    0.13
t4      Extra encoding time                  0.21    0.16    0.22

740 :YAMAGUTIseisei:2020/04/12(日) 15:15:16.58 ID:juW0pBg5d
Another stage, taking time k, is a baseline stage, involving response execution, etc.
Unfortunately, these stages are not sufficient to handle the data from the present experiment.
To see why this is so, consider the reaction times to initially access a category.
These reaction times fall into a 2 by 2 design with order (category-letter vs letter-category) and interval (2.5 sec vs simultaneous) as factors.
According to the Freedman-Loftus model, the processing times involved in initial access should be as follows:

Condition 1, category-letter; interval: RT1 = t2 + k
Condition 2, letter-category; interval: RT2 = t1 + t2 + k
Condition 3, category-letter; simultaneous: RT3 = t1 + t2 + k
Condition 4, letter-category; simultaneous: RT4 = t1 + t2 + k

Thus reaction times for Conditions 2-4 should be equal to each other and should differ (by t1) from the reaction time to Condition 1.
However, the data indicate that all four reaction times differ from one another, thereby necessitating the postulation of additional processing stages.
First, in Condition 4, the predisposition to encode the category before the letter may conflict with normal left-to-right reading habits.
Thus, an additional eye fixation could sometimes occur in Condition 4 relative to the other three conditions.
We shall label the time for this additional eye fixation t3.
Secondly, when category and letter are presented simultaneously (Conditions 3 and 4), reaction time must include the time to encode both stimuli.
With a 2.5-sec interval, on the other hand (Conditions 1 and 2), reaction time includes the time to encode only one of the two stimuli.
Let the extra encoding time required in Conditions 3 and 4 be designated by t4.

We are now in a position to include the two new stages in the four initial reaction times.

741 :YAMAGUTIseisei:2020/04/12(日) 15:18:31.33 ID:juW0pBg5d
(1a) Category-letter; interval: RT1 = t2 + k = 1.69 sec
(1b) Letter-category; interval: RT2 = t1 + t2 + k = 1.96 sec
(1c) Category-letter; simultaneous: RT3 = t1 + t2 + t4 + k = 2.18 sec
(1d) Letter-category; simultaneous: RT4 = t1 + t2 + t3 + t4 + k = 2.31 sec

By appropriate manipulations of Eqs. 1a-1d, we find that t1 = 0.27 sec (RT2 - RT1); (t2 + k) = 1.69 sec (RT1); t3 = 0.13 sec (RT4 - RT3); and t4 = 0.22 sec (RT3 - RT2).
The estimate of 0.27 sec for t1 (category entry time) coincides well with previous estimates obtained by Freedman and Loftus (1971) and Loftus and Freedman (1972).
The estimate of 0.22 sec for t4 (encoding time) is far greater than one would expect if “encoding” meant only the process of pattern-recognizing the visual stimulus (cf. Sperling, 1963, who estimated 10 msec per item for the pattern-recognition process).
Thus the obtained estimate of 0.22 sec must include a great deal more processing, although it is impossible in the present experiment to determine what such encoding might consist of.
Finally, since an eye fixation usually lasts on the order of 200-300 msec, the estimate of 0.13 sec for t3 (extra fixation time) is somewhat less than one would expect.
A possible reason for this discrepancy is that additional eye fixations may not be made on all of the Condition 4 trials.
The notion of an extra eye fixation sometimes occurring in Condition 4 is, of course, easily testable.
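
As an arithmetic check, the four stage estimates can be recovered from the four mean reaction times by the subtractions described above; a short script using the reported RT values:

    # Recover the stage estimates from Eqs. 1a-1d by subtraction.
    RT1, RT2, RT3, RT4 = 1.69, 1.96, 2.18, 2.31   # observed mean RTs (sec)

    t1 = RT2 - RT1        # category entry time
    t2_plus_k = RT1       # category search time plus baseline
    t4 = RT3 - RT2        # extra encoding time (simultaneous display)
    t3 = RT4 - RT3        # extra eye-fixation time

    print(round(t1, 2), round(t2_plus_k, 2), round(t3, 2), round(t4, 2))
    # -> 0.27 1.69 0.13 0.22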

One more parenthetical remark should be made.
As noted above, the interaction of interval time and category-letter order was not significant.
If the null hypothesis of no interaction is accepted, then inspection of Eqs. 1a-1d indicates that t1 = t3.
(This can be seen either by the fact that RT3 - RT1 = RT4 - RT2 or by the fact that RT2 - RT1 = RT4 - RT3, both of which are true under the null hypothesis.)
However, since nothing in the present experiment necessarily warrants acceptance of the null hypothesis, the equality of t1 and t3 should not be taken very seriously.

742 :YAMAGUTIseisei:2020/04/12(日) 15:22:46.15 ID:juW0pBg5d
What Stage Does Activation Affect?
Using the logic outlined above, it is possible to obtain estimates of t1, (t2 + k), t3, and t4 for second category presentations at Lags 0 and 2.
These estimates, along with the estimates given above for initial presentation, are shown in Table 1.
The statistical analyses of the data indicate that the only parameter which reliably changes over lag condition is t2 + k.
If we make the reasonable assumption that k remains constant over lag conditions, then t2, the category search time, constitutes the locus of the activation effect.
This finding agrees with the conclusion of Meyer (1973, p. 30), who noted that “The semantic distance between categories ... may affect the search rate for the second category.”

The invariance of encoding time (t4) over lag condition is somewhat at odds with the finding of Meyer et al. (1972, Experiment 3) that encoding time appears to be shortened by prior processing of semantically similar information.
The reason for this discrepancy is not entirely clear.
A possible explanation may lie in the fact that the processing delay between the two categories was much shorter in the Meyer et al. experiment than in the present experiment, and the activation decay function for encoding time may be different from the analogous decay function for search rate.





743 :YAMAGUTIseisei:2020/04/12(日) 15:24:16.79 ID:juW0pBg5d
REFERENCES

Battig, W. F., & Montague, W. E.
Category norms for verbal items in 56 categories: A replication and extension of the Connecticut category norms.
Journal of Experimental Psychology Monograph,
1969, 80(3, Pt. 2).

Collins, A. M., & Quillian, M. R.
Facilitating retrieval from semantic memory: The effect of repeating part of an inference.
In A. F. Sanders (Ed.), Attention and performance III.
Amsterdam: North-Holland, 1970.

Freedman, J. L., & Loftus, E. F.
Retrieval of words from long-term memory.
Journal of Verbal Learning & Verbal Behavior,
1971, 10, 107-115.

Loftus, E. F.
Activation of semantic memory.
American Journal of Psychology, 1974, in press.

Loftus, E. F., & Freedman, J. L.
Effect of category-name frequency on the speed of naming an instance of the category.
Journal of Verbal Learning & Verbal Behavior,
1972, 11, 343-347.

Meyer, D. E.
Correlated operations in searching stored semantic categories.
Journal of Experimental Psychology,
1973, 99, 124-133.

744 :YAMAGUTIseisei:2020/04/12(日) 15:24:44.18 ID:juW0pBg5d
Meyer, D. E., & Schvaneveldt, R. W.
Facilitation in recognizing pairs of words: Evidence of a dependence between retrieval operations.
Journal of Experimental Psychology,
1971, 90, 227-234.

Meyer, D. E., Schvaneveldt, R. W., & Ruddy, M. G.
Activation of lexical memory.
Paper presented at the meeting of the Psychonomic Society.
St. Louis, November 1972.

Schvaneveldt, R. W., & Meyer, D. E.
Retrieval and comparison processes in semantic memory.
In S. Kornblum (Ed.), Attention and performance IV.
New York: Academic Press, 1973.

Shapiro, S. I., & Palermo, D. S.
Conceptual organization and class membership: Normative data for representatives of 100 categories.
Psychonomic Monograph Supplements, 1970, 3(11, Whole No. 43).

Sperling, G.
A model for visual memory tasks.
Human Factors, 1963, 5, 19-31.

Warren, R. E.
Stimulus encoding and memory.
Unpublished doctoral dissertation, University of Oregon, 1970.


(Received for publication September 17, 1973; revision accepted December 6, 1973.)

745 :YAMAGUTIseisei:2020/06/19(金) 20:32:52.46 ID:ZY15djw41
>>727-744
http://link.springer.com/article/10.3758/BF03196906
http://link.springer.com/content/pdf/10.3758/BF03196906.pdf

 
>>732
If “dominance” is defined as the frequency with which a word is given as an exemplar of a category, then one of the two category-letter stimuli will be referred to as more dominant than the other.


746 :YAMAGUTIseisei:2020/08/28(金) 00:26:14.31 ID:STE/0glun
http://nature.com/articles/s41598-020-58831-9
* Article
* Open Access
* Published: 25 February 2020

Memristive synapses connect brain and silicon spiking neurons

* Alexantrou Serb (1),
* Andrea Corna (2),
* Richard George (3),
* Ali Khiat (1),
* Federico Rocchi (2),
* Marco Reato (2),
* Marta Maschietto (2),
* Christian Mayr (3),
* Giacomo Indiveri (4) (ORCID: orcid.org/0000-0002-7109-1689),
* Stefano Vassanelli (2) (ORCID: orcid.org/0000-0003-0389-8023) &
* Themistoklis Prodromakis (1) (ORCID: orcid.org/0000-0002-6267-6909)

Scientific Reports volume 10, Article number: 2590 (2020)
Subjects

* Bionanoelectronics
* Nanosensors

747 :YAMAGUTIseisei:2020/08/28(金) 00:34:57.19 ID:STE/0glun
Memristors
The memristive synapse set-up consisted of an array of memristive devices positioned inside an ArC memristor characterisation and testing instrument [33] (Supplementary Fig. 5; http://www.arc-instruments.co.uk).
The instrument is controlled by a PC, which handles all the communications over UDP through a Python-based user interface.
The software is configured to react to UDP packets carrying information about the firing of either artificial or biological neurons (who fired when).
Once a packet is received, the ID of the neuron that emitted it and the time of spiking are both retrieved from the packet payload, and the neural connectivity matrix (held at the Southampton set-up) is consulted in order to determine which neurons are pre- and which are post-synaptic to the firing cell.
Then, if the plasticity conditions are met, the ArC instrument applies programming pulses that cause the memristive synapses to change their resistive states.
Importantly, the set-up can control whether LTP- or LTD-type plasticity is to be applied in each case, but once the pulses have been applied it is the device responses that determine the magnitude of the plasticity.
Notably, the resistivity transitions of the device are non-volatile: they hold over at least hours [27], as also exemplified in our prototype experiment, and are therefore fully compatible with typical LTP and LTD time scales of natural synapses.
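
The packet-handling flow just described can be sketched as follows; this is only an illustration, not the actual ArC control software, and the packet format, connectivity matrix, plasticity condition, and function names are all invented:

    # Illustrative handler: a UDP payload carries (neuron ID, spike time);
    # the connectivity matrix is consulted and, if a hypothetical
    # plasticity condition holds, programming pulses are "applied".
    import struct

    connectivity = {55: [12]}   # post-synaptic ID -> pre-synaptic IDs
    last_spike = {}             # neuron ID -> most recent spike time

    def plasticity_condition_met(pre, post, window=20.0):
        # e.g. pre and post both fired within an (invented) time window
        return (pre in last_spike and post in last_spike
                and abs(last_spike[pre] - last_spike[post]) <= window)

    def apply_programming_pulses(pre, post, potentiate):
        # Stand-in for the instrument call; per the text, the device
        # response then determines the magnitude of the weight change.
        print(("LTP" if potentiate else "LTD"), "pulses on synapse", pre, "->", post)

    def on_packet(payload):
        neuron_id, spike_time = struct.unpack("!Id", payload)  # assumed format
        last_spike[neuron_id] = spike_time
        for post, pres in connectivity.items():
            for pre in pres:
                if neuron_id in (pre, post) and plasticity_condition_met(pre, post):
                    apply_programming_pulses(pre, post, last_spike[pre] <= last_spike[post])

    on_packet(struct.pack("!Id", 12, 305.0))   # pre-synaptic neuron fires
    on_packet(struct.pack("!Id", 55, 310.0))   # post fires -> LTP-type pulses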
The system is sustained by a specific methodology for handling timing within the overall network (Zurich, Southampton, Padova).
The set-up in Southampton, being the node that links Zurich and Padova together, controls the overall handling of time.


748 :YAMAGUTIseisei:2020/08/28(金) 00:58:25.57 ID:STE/0glun
Under this system, one of the partners (in our case Zurich) is labelled as the “primary partner” and all timing information arriving from that partner is treated as a ground truth.
All timing information sent by other partners then has to be related to this ground truth; for example, if the primary partner says that neuron 12 fires a spike at time 305, then the secondary partner(s) is informed of this (through Southampton).
If a neuron in the secondary partner's set-up then fires 5 time units (as measured by a wall clock) after being informed of the firing of neuron 12, it emits a packet informing Southampton that, e.g., neuron 55 fired at time 310.
This way the relative timing between spikes arriving from the primary partner and the spikes triggered by the secondary partner(s) in response is maintained despite any network delays.
The price is that if the secondary partners wish to communicate spikes to the primary partner, network delays for the entire round-trip are then burdening the secondary-to-primary pathway.
The details of timing control at each partner site are fairly complicated and constrained by the set-ups at each partner, but all timing information is eventually encoded in an “absolute time” record held at Southampton.
The rationale behind this design decision was to ensure that at least in the pathway from primary to secondary partner(s) timing control is sufficiently tight to sustain plasticity in the face of network delays.
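
A minimal sketch of this ground-truth timing scheme, with invented names and a seconds-based wall clock standing in for the set-up's time units:

    # Secondary partner: relate local wall-clock time to the primary
    # partner's ground-truth timestamps so relative spike timing survives
    # network delays on the primary-to-secondary path.
    import time

    class SecondaryClock:
        def __init__(self):
            self.primary_time = None   # ground-truth time of last primary spike
            self.local_time = None     # local wall-clock when it was received

        def on_primary_spike(self, neuron_id, ground_truth_time):
            # Timing information from the primary partner is treated as truth.
            self.primary_time = ground_truth_time
            self.local_time = time.monotonic()

        def stamp_local_spike(self, neuron_id):
            # Report a local spike relative to the ground truth.
            elapsed = time.monotonic() - self.local_time
            return neuron_id, self.primary_time + elapsed

    clock = SecondaryClock()
    clock.on_primary_spike(12, 305.0)    # "neuron 12 fired at time 305"
    time.sleep(5e-3)                     # a local neuron fires shortly after
    print(clock.stamp_local_spike(55))   # -> (55, ~305.005): relative timing kept
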
Neuronal culture and electrophysiology
Embryonic (E18) rat hippocampal neurons were plated and cultured on the CMEA according to procedures described in detail in [34].
Recordings were performed on 8-12 DIV neurons.
The experimental setup in UNIPD (Supplementary Fig. 1) enabled UDP-triggered capacitive stimulation of neurons [13] while simultaneously recording and communicating via UDP the occurrence of depolarisations that were measured by patch-clamp whole-cell recording.

749 :YAMAGUTIseisei:2020/08/28(金) 01:40:48.13 ID:STE/0glun
The CMEA (20 × 20 independent TiO2 capacitors, each one of area 50 × 50 μm2) was controlled by a dedicated stimulation board, and all the connections to the partners, Southampton and Zurich, were managed by a PC running LabVIEW-based software (National Instruments Corp, Austin, TX, USA).
The stimulation protocol was derived from [13] and further optimized for non-invasive adjustable stimulation of the neurons.
In brief, capacitive stimulation was adjusted to the memristor's resistance (i.e. the synaptor weight) by varying the repetition number of appropriate stimulation waveforms (Supplementary Fig. 1).
Patch-Clamp recordings were performed in whole-cell current-clamp configuration using an Axopatch 200B amplifier (Molecular Devices, USA) connected to the PC through a BNC-2110 Shielded Connector Block (National Instruments Corp, Austin, TX, USA) along with a PCI-6259 PCI Card (National Instruments Corp, Austin, TX, USA).
WinWCP (Strathclyde Electrophysiology Software, University of Strathclyde, Glasgow, UK) was used for data acquisition.
Micropipettes were pulled from borosilicate glass capillaries (GB150T-10, Science Products GmbH, Hofheim, Germany) using a P-97 Flaming/Brown Micropipette Puller (Sutter Instruments Corp., Novato, CA, USA).
Intracellular pipette solution and extracellular solution used during the experiments were respectively (in mM): 6.0 KCl, 120 K gluconate, 10 HEPES, 3.0 EGTA, 5 MgATP, 20 Sucrose (adjusted to pH 7.3 with 1N KOH); 135.0 NaCl, 5.4 KCl, 1.0 MgCl2, 1.8 CaCl2, 10.0 Glucose, 5.0 HEPES (adjusted to pH 7.4 with 1N NaOH).
Digitised recordings were analysed by custom LabVIEW software running on the PC, allowing detection and discrimination of firing and EPSP activity through a thresholding approach.
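
Purely as an illustration of such a thresholding approach (the actual analysis ran in LabVIEW, and these thresholds and the sample trace are invented), a toy classifier might look like:

    # Toy threshold classifier for firing vs EPSP events in a voltage trace.
    SPIKE_THRESHOLD_MV = 0.0     # crossings above 0 mV counted as firing
    EPSP_THRESHOLD_MV = -55.0    # sub-threshold depolarisations as EPSPs

    def classify_events(trace_mv):
        events = []
        for i in range(1, len(trace_mv)):
            v, prev = trace_mv[i], trace_mv[i - 1]
            if v >= SPIKE_THRESHOLD_MV and prev < SPIKE_THRESHOLD_MV:
                events.append((i, "spike"))
            elif v >= EPSP_THRESHOLD_MV and prev < EPSP_THRESHOLD_MV:
                events.append((i, "EPSP"))
        return events

    trace = [-65.0, -60.0, -54.0, -30.0, 10.0, -50.0, -65.0]
    print(classify_events(trace))   # -> [(2, 'EPSP'), (4, 'spike')]
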
All experiments were performed in accordance with the Italian and European legislation for the use of animals for scientific purposes and protocols approved by the ethical committee of the University of Padova and by the Italian Ministry of Health
(authorisation number 522/2018-PR).


750 :YAMAGUTIseisei:2020/08/28(金) 01:44:33.73 ID:STE/0glun
References

1.
O’Doherty, J. E. et al.
Active tactile exploration using a brain-machine-brain interface.
Nature 479, 228-231 (2011).
2.
Hampson, R. E. et al.
Developing a hippocampal neural prosthetic to facilitate human memory encoding and recall.
J. Neural Eng. 15, 036014 (2018).
3.
Thakor, N. V.
Translating the Brain-Machine Interface.
Sci. Transl. Med. 5, 210ps17 (2013).
4.
Mead, C.
Neuromorphic electronic systems.
Proc. IEEE 78, 1629-1636 (1990).
5.
Vassanelli, S. & Mahmud, M.
Trends and Challenges in Neuroengineering: Toward “Intelligent” Neuroprostheses through Brain-“Brain Inspired Systems” Communication.
Front. Neurosci. 10 (2016).
6.
Boi, F. et al.
A Bidirectional Brain-Machine Interface Featuring a Neuromorphic Hardware Decoder.
Front. Neurosci. 10 (2016).



751 :YAMAGUTIseisei:2020/08/28(金) 01:45:59.67 ID:STE/0glun
7.
Wei, S. L. et al.
Emulating long-term synaptic dynamics with memristive devices.
arXiv:1509.01998 (2015).
8.
Berdan, R. et al.
Emulating short-term synaptic dynamics with memristive devices.
Scientific Reports 6 (2015).
9.
Burr, G. W. et al.
Experimental Demonstration and Tolerancing of a Large-Scale Neural Network (165 000 Synapses) Using Phase-Change Memory as the Synaptic Weight Element.
IEEE Trans. Electron Devices 62, 3498-3507 (2015).
10.
Yang, J. J., Strukov, D. B. & Stewart, D. R.
Memristive devices for computing.
Nat. Nanotechnol. 8, 13-24 (2013).
11.
Gupta, I. et al.
Real-time encoding and compression of neuronal spikes by metal-oxide memristors.
Nat. Commun. 7, 12805 (2016).
12.
Birmingham, K. et al.
Bioelectronic medicines: a research roadmap.
Nat. Rev. Drug Discov. 13, 399-400 (2014).


752 :YAMAGUTIseisei:2020/08/28(金) 01:49:08.05 ID:STE/0glun
13.
Schoen, I. & Fromherz, P.
Extracellular Stimulation of Mammalian Neurons Through Repetitive Activation of Na+ Channels by Weak Capacitive Currents on a Silicon Chip.
J. Neurophysiol. 100, 346-357 (2008).
14.
George, R., Mayr, C., Indiveri, G. & Vassanelli, S.
Event-based softcore processor in a biohybrid setup applied to structural plasticity.
In 2015 International Conference on Event-based Control, Communication, and Signal Processing (EBCCSP) 1-4, https://doi.org/10.1109/EBCCSP.2015.7300664 (IEEE, 2015).
15.
Rast, A. D. et al.
A location-independent direct link neuromorphic interface.
In The 2013 International Joint Conference on Neural Networks (IJCNN) 1-8, https://doi.org/10.1109/IJCNN.2013.6706887 (IEEE, 2013).
16.
Keren, H., Partzsch, J., Marom, S. & Mayr, C. G.
A Biohybrid Setup for Coupling Biological and Neuromorphic Neural Networks.
Front. Neurosci. 13 (2019).
17.
Dudek, S. M. & Bear, M. F.
Homosynaptic long-term depression in area CA1 of hippocampus and effects of N-methyl-D-aspartate receptor blockade.
Proc. Natl. Acad. Sci. USA 89, 4363-4367 (1992).
18.
Cooper, L. N. & Bear, M. F.
The BCM theory of synapse modification at 30: interaction of theory with experiment.
Nat. Rev. Neurosci. 13, 798-810 (2012).

753 :YAMAGUTIseisei:2020/08/28(金) 01:51:23.51 ID:STE/0glun
19.
Vassanelli, S., Mahmud, M., Girardi, S. & Maschietto, M.
On the Way to Large-Scale and High-Resolution Brain-Chip Interfacing.
Cogn. Comput. 4, 71-81 (2012).
20.
Giacomello, M. et al.
Stimulation of Ca2+ signals in neurons by electrically coupled electrolyte-oxide-semiconductor capacitors.
J. Neurosci. Methods 198, 1-7 (2011).
21.
Spira, M. E. & Hai, A.
Multi-electrode array technologies for neuroscience and cardiology.
Nat. Nanotechnol. 8, 83 (2013).
22.
Alivisatos, A. P. et al.
Nanotools for Neuroscience and Brain Activity Mapping.
ACS Nano 7, 1850-1866 (2013).
23.
Angle, M. R., Cui, B. & Melosh, N. A.
Nanotechnology and neurophysiology.
Curr. Opin. Neurobiol. 32, 132-140 (2015).
24.
Duan, X. & Lieber, C. M.
Nanoscience and the nano-bioelectronics frontier.
Nano Res. 8, 1-22 (2015).

754 :YAMAGUTIseisei:2020/08/28(金) 01:52:26.41 ID:STE/0glun
25.
Brivio, S. et al.
Experimental study of gradual/abrupt dynamics of HfO2-based memristive devices.
Appl. Phys. Lett. 109, 133504 (2016).
26.
Serrano-Gotarredona, T., Masquelier, T., Prodromakis, T., Indiveri, G. & Linares-Barranco, B.
STDP and STDP variations with memristors for spiking neuromorphic learning systems.
Front. Neurosci. 7 (2013).
27.
Serb, A. et al.
Unsupervised learning in probabilistic neural networks with multi-state metal-oxide memristive synapses.
Nat. Commun. 7 (2016).
28.
Qiao, N. et al.
A reconfigurable on-line learning spiking neuromorphic processor comprising 256 neurons and 128K synapses.
Front. Neurosci. 9, 141 (2015).
29.
Boegerhausen, M., Suter, P. & Liu, S.-C.
Modeling Short-Term Synaptic Depression in Silicon.
Neural Comput. 15, 331-348 (2003).
30.
Mitra, S., Fusi, S. & Indiveri, G.
Real-Time Classification of Complex Patterns Using Spike-Based Learning in Neuromorphic VLSI.
IEEE Trans. Biomed. Circuits Syst. 3, 32-42 (2009).

755 :YAMAGUTIseisei:2020/08/28(金) 01:54:02.52 ID:STE/0glun
31.
Livi, P. & Indiveri, G.
A current-mode conductance-based silicon neuron for address-event neuromorphic systems.
In 2009 IEEE International Symposium on Circuits and Systems 2898-2901, http://doi.org/10.1109/ISCAS.2009.5118408 (IEEE, 2009).
32.
Deiss, S., Douglas, R. & Whatley, A.
A pulse-coded communications infrastructure for neuromorphic systems.
Pulsed Neural Netw. 157-178 (1999).
33.
Berdan, R. et al.
A μ-Controller-Based System for Interfacing Selectorless RRAM Crossbar Arrays.
IEEE Trans. Electron Devices 62, 2190-2196 (2015).
34.
Antonucci, D. E., Lim, S. T., Vassanelli, S. & Trimmer, J. S.
Dynamic localization and clustering of dendritic Kv2.1 voltage-dependent potassium channels in developing hippocampal neurons.
Neuroscience 108, 69-81 (2001).
35.
Indiveri, G. et al.
Neuromorphic silicon neuron circuits.
Front. Neurosci. 5, 73 (2011).
36.
Stathopoulos, S. et al.
Multibit memory operation of metal-oxide bi-layer memristors.
Sci. Rep. 7 (2017).

756 :YAMAGUTIseisei:2020/08/28(金) 01:54:56.67 ID:STE/0glun
Author information
Affiliations

1.
Centre for Electronics Frontiers, University of Southampton, Southampton, SO17 1BJ, UK
Alexantrou Serb, Ali Khiat & Themistoklis Prodromakis
2.
Biomedical Sciences and Padua Neuroscience Center, University of Padova, Padova, 35131, Italy
Andrea Corna, Federico Rocchi, Marco Reato, Marta Maschietto & Stefano Vassanelli
3.
Institute of Circuits and Systems, TU Dresden, Dresden, 01062, Germany
Richard George & Christian Mayr
4.
Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, 8057, Switzerland
Giacomo Indiveri

757 :YAMAGUTIseisei:2020/08/28(金) 01:55:43.29 ID:STE/0glun
Contributions
The experiments were jointly conceived by T.P., S.V. and G.I., who share senior authorship.
The experiments were jointly designed and run by A.S., A.C. and R.G., who are acknowledged as shared first authors.
A.K. manufactured the memristive devices.
F.R. and M.R. assisted with the biological system set-up and operation.
M.M. cultured neurons on chips.
C.M. provided valuable feedback and guidance during the write-up of the paper.
The paper was jointly written by all co-authors.

Corresponding authors
Correspondence to Stefano Vassanelli or Themistoklis Prodromakis.

 
Ethics declarations

Competing interests
The authors declare no competing interests.

Additional information

Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information
http://static-content.springer.com/esm/art%3A10.1038%2Fs41598-020-58831-9/MediaObjects/41598_2020_58831_MOESM1_ESM.pdf

758 :YAMAGUTIseisei:2020/08/28(金) 02:10:03.84 ID:STE/0glun
Rights and permissions

Open Access
This article is licensed under a Creative Commons Attribution 4.0 International License,
which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material.
If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .

 
About this article

Cite this article
Serb, A., Corna, A., George, R. et al. Memristive synapses connect brain and silicon spiking neurons. Sci Rep 10, 2590 (2020). https://doi.org/10.1038/s41598-020-58831-9

* Received: 22 October 2019
* Accepted: 21 January 2020
* Published: 25 February 2020
* DOI: http://doi.org/10.1038/s41598-020-58831-9

 


759 :YAMAGUTIseisei:2021/09/07(火) 11:21:59.91 ID:Sg5KSVwHZ
sage

760 :オーバーテクナナシー:2021/09/14(火) 07:52:03.64 ID:lSdSBXgiV
UNIVERSAL TRANSFORMERS. Published as a conference paper at ICLR 2019. http://arxiv-vanity.com/papers/1807.03819v3#15/# http://arxiv.org/abs/1807.03819v3#15

Mostafa Dehghani* † Stephan Gouws* Oriol Vinyals
University of Amsterdam DeepMind DeepMind
dehghani@uva.nl sgouws@google.com vinyals@google.com

Jakob Uszkoreit Łukasz Kaiser
Google Brain Google Brain
usz@google.com lukaszkaiser@google.com

D.4. LEARNING TO EXECUTE (LTE).
LTE is a set of tasks indicating the ability of a model to learn to execute computer programs and was proposed by Zaremba & Sutskever (2015).
These tasks include two subsets:
1) program evaluation tasks (program, control, and addition) that are designed to assess the ability of models for understanding numerical operations, if-statements, variable assignments, the compositionality of operations, and more, as well as
2) memorization tasks (copy, double, and reverse).

The difficulty of the program evaluation tasks is parameterized by their length and nesting.
The length parameter is the number of digits in the integers that appear in the programs (so the integers are chosen uniformly from [1, 10^length]), and the nesting parameter is the number of times we are allowed to combine the operations with each other.
Higher values of nesting yield programs with deeper parse trees.
For instance, here is a program that is generated with length = 4 and nesting = 3.
Input:
    j=8584
    for x in range(8):
     j+=920
    b=(1500+j)
    print((b+7567))
Target:
    25011
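
For illustration only, a toy generator in the same spirit (not the Zaremba & Sutskever generator; it composes just assignment, addition, and for-loops) could be:

    # Toy generator for LTE-style program-evaluation examples.
    import random

    def gen_program(length=4, nesting=3):
        hi = 10 ** length - 1                 # integers of at most `length` digits
        lines = [f"j={random.randint(1, hi)}"]
        for _ in range(nesting - 1):          # combine operations `nesting` times
            if random.random() < 0.5:
                lines.append(f"for x in range({random.randint(2, 9)}):")
                lines.append(f" j+={random.randint(1, hi)}")
            else:
                lines.append(f"j=(j+{random.randint(1, hi)})")
        lines.append(f"print((j+{random.randint(1, hi)}))")
        return "\n".join(lines)

    src = gen_program()
    print(src)    # the program text is the model input ...
    exec(src)     # ... and the printed value is the target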


762 :YAMAGUTIseisei:2022/05/29(日) 03:53:57.30 ID:npUmdHxq/
http://webcache.googleusercontent.com/search?q=cache:www.lst.ethz.ch/research/publications/WCSA_2008/WCSA_2008.pdf

 
CellVM: A Homogeneous Virtual Machine Runtime System for a Heterogeneous Single-Chip Multiprocessor

 
Albert Noll, ETH Zurich, albert.noll@inf.ethz.ch
Andreas Gal, University of California, Irvine, gal@uci.edu
Michael Franz, University of California, Irvine, franz@uci.edu

 

The assign and method benchmarks, on the other hand, include CellVM’s worst case scenario: synchronized methods and data structures.

Figure 4.
Performance evaluation of low-level VM operations.
Values are normalized to JamVM running on the PPE.
