The cause was the following:
Change
is = new FileInputStream(new File(path));
to
is = this.getClass().getClassLoader().getResourceAsStream(path);
and place the properties file directly under "src".
When a file is read with new File on Android, a leading "/" seems to be prepended to the path automatically, so the properties file was not being loaded correctly.
at com.google.android.gcm.GCMRegistrar.checkDevice(GCMRegistrar.java:98)
You need to install the app directly on a physical device to verify this.
Eclipse -> right-click the project -> Export -> Export Android Application -> save to the desktop -> attach the apk file to a Gmail message and send it -> receive that mail on the Android device -> tap the apk file -> install the app
C:\myinstallprogram\adt-bundle-windows-x86_64-20131030\sdk\extras\google\usb_driver
Because the Google USB Driver was installed earlier, the path above exists (it does not exist if the driver has not been installed).
db.hogeCollection.update({"_id":"abcdefg"},{$set:{"number":NumberInt(1000000000)}}, false, true);
mysql -u [username] -p [dbname] -h [rdb_ip_address]
Replace [username], [dbname], and [rdb_ip_address] with the user name, database name, and DB server IP address you set when creating the DB server.
DBUser=[username]
DBHost=[rdb_ip_address]
DBPassword=[the password set when creating the DB server]
$DB["USER"] = '[username]';
$DB["SERVER"] = '[rdb_ip_address]';
$DB["PASSWORD"] = '[the password set when creating the DB server]';
<source>
type tail
format syslog
path /var/log/messages
tag log.messages
</source>
<match log.messages>
type forward
flush_interval 5s
<server>
name server_host_name
host server_host_name
port 24224
weight 60
</server>
</match>
<source>
type forward
port 24224
</source>
<match log.messages>
index_name messages
logstash_format true
type elasticsearch
host server_host_name
port 9200
include_tag_key true
tag_key _key
flush_interval 10s
</match>
Setting include_tag_key to true automatically attaches the tag that matched the <match> block to each record sent to Elasticsearch.
format /^(?<timestamp>\w{3} \d{2} \d{2}:\d{2}:\d{2}) (?<host>[^ ]*) (?<body>.*)$/
If you parse the log yourself like this and pass the result to Elasticsearch, the timestamp is not parsed correctly.
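To see what that format line captures, here is a minimal sketch in Python (fluentd uses Ruby regexes, whose named groups are written `(?<name>...)` rather than Python's `(?P<name>...)`; the sample log line is made up):

```python
import re

# Same pattern as the fluentd "format" line, in Python named-group syntax
pattern = re.compile(
    r'^(?P<timestamp>\w{3} \d{2} \d{2}:\d{2}:\d{2}) (?P<host>[^ ]*) (?P<body>.*)$'
)

# A made-up /var/log/messages line in the expected layout
line = 'Nov 05 12:48:22 web01 kernel: sample message'

m = pattern.match(line)
record = m.groupdict()
print(record)
# {'timestamp': 'Nov 05 12:48:22', 'host': 'web01', 'body': 'kernel: sample message'}
```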
curl -X POST 'http://server_host_name:9200/messages'
Create the same index as the index_name specified in the server-side fluent.conf.
curl -X PUT http://server_host_name:9200/messages/fluentd/_mapping -d '{ "app_log" : { "properties" : { "host" : {"type" : "string"}, "body" : {"type" : "string"}, "timestamp": {"format" : "MMM dd HH:mm:ss","type" : "date", "locale" : "ja_JP"} } } }'
{
    "require": {
        "aws/aws-sdk-php": "2.*"
    }
}
WARNING: channel "pear.php.net" has updated its protocols, use "pear channel-update pear.php.net" to update
Loading composer repositories with package information
Installing dependencies (including require-dev)
  - Installing symfony/event-dispatcher (v2.4.0)
    Downloading: 100%
  - Installing guzzle/guzzle (v3.7.4)
    Downloading: 100%
  - Installing aws/aws-sdk-php (2.4.11)
    Downloading: 100%
symfony/event-dispatcher suggests installing symfony/dependency-injection ()
symfony/event-dispatcher suggests installing symfony/http-kernel ()
aws/aws-sdk-php suggests installing doctrine/cache (Adds support for caching of credentials and responses)
aws/aws-sdk-php suggests installing ext-apc (Allows service description opcode caching, request and response caching, and credentials caching)
aws/aws-sdk-php suggests installing monolog/monolog (Adds support for logging HTTP requests and responses)
aws/aws-sdk-php suggests installing symfony/yaml (Eases the ability to write manifests for creating jobs in AWS Import/Export)
Writing lock file
<?php
require 'vendor/autoload.php';

use Aws\Sqs\SqsClient;

$client = SqsClient::factory(array(
    'key'    => 'your access key',
    'secret' => 'your secret access key',
    'region' => 'us-west-2'
));

echo 'ListQueues' . "\r\n";
$result0 = $client->listQueues(array(
    'QueueNamePrefix' => 'string',
));
$result0->get('QueueUrls');
var_dump($result0);

$result1 = $client->createQueue(array(
    'QueueName'  => 'kaka_queue002',
    'Attributes' => array(
        'VisibilityTimeout' => '300',
    ),
));
$queueUrl = $result1->get('QueueUrl');
echo $queueUrl . "\r\n";

$result2 = $client->sendMessage(array(
    'QueueUrl'     => $queueUrl,
    'MessageBody'  => 'test message !',
    'DelaySeconds' => 0,
));

echo 'ReceiveMessage' . "\r\n";
$result3 = $client->receiveMessage(array(
    'QueueUrl'            => $queueUrl,
    'MaxNumberOfMessages' => 10,
    'WaitTimeSeconds'     => 20,
));
$body = $result3['Messages'];
echo $body[0]['Body'] . "\r\n";
#var_dump($result3);
echo 'OK';
?>
Composer version 80499bb02418711b34bba59c1a6d8032429e5702 2013-12-06 12:32:19
server = servername
puppetmasterd --no-daemonize --d
file { '/etc/hosts':
  owner => 'root',
  group => 'root',
  mode  => 644,
}
※ If the .pp file is not named "site", it may not work correctly.
syntax enable
set background=dark
colorscheme solarized
let g:solarized_termcolors=256
alias ls='gls --color=auto'
eval $(gdircolors /var/tmp/dircolors-solarized/dircolors.ansi-universal)
source ~/.bash_profile
extension=mongo.so
<?php
new MongoClient("mongodb://dbserver:27017");
?>
8967:20131204:171621.274 item [db_server:mikoomi-mongodb-plugin.sh[-h {$SERVER} -p {$PORT} -z {$HOSTNAME}]] became supported
If a log like the above is output, the setup is complete.
Install it with yum -y install apachetop (on CentOS 6.3).
<?php
require_once "HTTP/Client.php";

# define
$api_key = 'Please input your api_key';
$custom_search_key = 'Please input your google custom search id';
$input_file = 'keyword_list.csv';
$result_file = 'result.tsv';
$sleep_time = 10;

$csv = array();
$fp = fopen($input_file, "r");
$fp2 = fopen($result_file, "w");
while (($data = fgetcsv($fp, 0, ",")) !== FALSE) {
    $csv[] = $data;
}
fclose($fp);
#var_dump($csv);

$client =& new HTTP_Client();
for ($i = 1; $i < count($csv); $i++) {
    $keyword = $csv[$i][1];
    $urlStr = "https://www.googleapis.com/customsearch/v1?key=" . $api_key . "&cx=" . $custom_search_key . "&q=" . urlencode($keyword) . "&alt=json";
    #echo $urlStr;
    fwrite($fp2, $i . "\t" . $keyword . "\t");
    $client->get($urlStr);
    $response = $client->currentResponse();
    $json = json_decode($response['body'], true);
    if (array_key_exists('items', $json)) {
        $items = $json['items'];
        for ($j = 0; $j < count($items); $j++) {
            #var_dump($items);
            $items_info = $items[$j];
            fwrite($fp2, str_replace(array("\r\n","\r","\n"), '', $items_info['title']) . "\t" . str_replace(array("\r\n","\r","\n"), '', $items_info['snippet']) . "\t");
            #echo 'title : ' . str_replace(array("\r\n","\r","\n"), '', $items_info['title']) . "\r\n";
            #echo 'snippet : ' . str_replace(array("\r\n","\r","\n"), '', $items_info['snippet']) . "\r\n";
        }
    }
    fwrite($fp2, "\r\n");
    sleep($sleep_time);
}
fclose($fp2);
?>
Change
#!/usr/bin/python
to
#!/usr/bin/python2.4
so that Python 2.4 is used explicitly.
<?php
require_once './simple_html_dom.php';

$csv = array();
$input_file = 'keyword_list.csv';
$fp = fopen($input_file, "r");
$result_file = 'result.csv';
$fp2 = fopen($result_file, "w");
$search_result_count = 10;
$sleep_time = 10;

while (($data = fgetcsv($fp, 0, ",")) !== FALSE) {
    $csv[] = $data;
}
fclose($fp);
#var_dump($csv);

for ($i = 1; $i < count($csv); $i++) {
    $keyword = $csv[$i][1];
    $urlStr = "http://www.google.co.jp/search?num=$search_result_count&q=" . urlencode($keyword) . '&ie=utf-8&oe=utf-8';
    $html = file_get_html($urlStr);
    $lcnt = 1;
    fwrite($fp2, $i . ',' . $keyword . ',');
    $result = '';
    foreach ($html->find('h3[class=r]') as $e) {
        $result_title = preg_replace('/<a href=(.+?)>/', '', $e->outertext);
        $result_title = preg_replace('/<b>|<\/b>|<\/a>|<h3 class="r">/', '', $result_title);
        $result_title = preg_replace('/<\/h3>/', '', $result_title);
        $result = $result . $result_title . ',';
        $lcnt++;
    }
    $result = rtrim($result, ",");
    fwrite($fp2, $result . "\r\n");
    sleep($sleep_time);
}
fclose($fp2);
?>
#!/bin/sh
count=1
cat /var/log/messages | while read line
do
  date=`echo ${line} | awk '{print $1,$2,$3}'`;
  hostname=`echo ${line} | awk '{print $4}'`;
  message=`echo ${line} | awk '{for(j=5;j<NF;j++){printf("%s ",$j)}print $NF}'`;
  timestamp=`date "+%Y-%m-%d %H:%M:%S"`
  #echo ${date} ${hostname} ${message}
  curl -X POST http://localhost:9200/system/message/${count} -d "{\"date\":\"${date}\",\"hostname\":\"${hostname}\",\"message\":\"${message}\",\"@timestamp\":\"${timestamp}\"}"
  echo ${count};
  count=`expr ${count} + 1`;
done
This splits each line of /var/log/messages into date, hostname, and message, and POSTs the result to Elasticsearch.
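The same field-splitting can be sketched in Python to show the JSON document each log line turns into (the sample line is made up; the field layout follows the awk calls in the script above):

```python
import json

# A made-up /var/log/messages line
line = 'Nov  5 12:48:22 web01 kernel: sample message'

fields = line.split()
doc = {
    'date': ' '.join(fields[0:3]),    # awk '{print $1,$2,$3}'
    'hostname': fields[3],            # awk '{print $4}'
    'message': ' '.join(fields[4:]),  # awk fields 5..NF
}
print(json.dumps(doc))
# {"date": "Nov 5 12:48:22", "hostname": "web01", "message": "kernel: sample message"}
```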
(load-file "~/.emacs.d/site-lisp/php-mode-1.5.0/php-mode.el")
(push '("\\.php$" . php-mode) auto-mode-alist)
WARNING: channel "pear.php.net" has updated its protocols, use "pear channel-update pear.php.net" to update
downloading Services_Amazon_SQS-0.3.0.tgz ...
Starting to download Services_Amazon_SQS-0.3.0.tgz (34,121 bytes)
.........done: 34,121 bytes
downloading HTTP_Request2-2.1.1.tgz ...
Starting to download HTTP_Request2-2.1.1.tgz (99,151 bytes)
...done: 99,151 bytes
install ok: channel://pear.php.net/HTTP_Request2-2.1.1
install ok: channel://pear.php.net/Services_Amazon_SQS-0.3.0
※ Note that pear 1.9.2 or later is required here; with an older pear the install fails with an ERROR, so upgrade pear first.
<?php
require '/usr/share/pear/Services/Amazon/SQS/Queue.php';

$accesskey = 'your accesskeyId';
$secretkey = 'your secretkeyId';
$accountId = 'your accountId ex) 000000000000';
$queueName = 'kaka_queue001';
$queueURL = 'http://queue.amazonaws.com/' . $accountId . '/' . $queueName;

// send message
$sqs = new Services_Amazon_SQS_Queue($queueURL, $accesskey, $secretkey);
try {
    $sqs->send('test message');
} catch (Services_Amazon_SQS_Exception $e) {
    trigger_error($e->getMessage());
}
?>
ERROR: unable to unpack /tmp/tmpACbIi9/Structures_Graph-1.0.4.tgz
http://stackoverflow.com/questions/8571925/upgrading-pear-on-x86-64-gnu-linux
package com.kakakikikeke.sample.test;

import java.util.HashMap;

public class Test {
  public static void main(String[] args) {
    HashMap<String, String> map1 = new HashMap<String, String>();
    map1.put("key1", "value");
    map1.put("key2", "value");
    System.out.println("* before");
    for (String key : map1.keySet()) {
      System.out.println(key);
      System.out.println(map1.get(key));
    }
    HashMap<String, String> map2 = map1;
    map2.remove("key1");
    System.out.println("* after");
    for (String key : map1.keySet()) {
      System.out.println(key);
      System.out.println(map1.get(key));
    }
  }
}
package com.kakakikikeke.sample.test;

import java.util.HashMap;

public class Test {
  public static void main(String[] args) {
    HashMap<String, String> map1 = new HashMap<String, String>();
    map1.put("key1", "value");
    map1.put("key2", "value");
    System.out.println("* before");
    for (String key : map1.keySet()) {
      System.out.println(key);
      System.out.println(map1.get(key));
    }
    HashMap<String, String> map2 = new HashMap<String, String>(map1);
    map2.remove("key1");
    System.out.println("* after");
    for (String key : map1.keySet()) {
      System.out.println(key);
      System.out.println(map1.get(key));
    }
  }
}
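The same distinction exists in most languages. As an illustrative aside (my addition, not part of the original post), here is the equivalent in Python, where assignment aliases a dict and a copy must be made explicitly:

```python
# Assignment copies the reference, not the dict
d1 = {'key1': 'value', 'key2': 'value'}
d2 = d1
d2.pop('key1')
print(d1)  # {'key2': 'value'} -- d1 changed too

# An explicit copy leaves the original untouched
d1 = {'key1': 'value', 'key2': 'value'}
d3 = dict(d1)
d3.pop('key1')
print(d1)  # {'key1': 'value', 'key2': 'value'}
```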
ClientConfiguration conf = new ClientConfiguration();
conf.setProxyHost("proxyhostname");
conf.setProxyPort(8080);
AWSCredentialsProvider credentialsProvider = new ClasspathPropertiesFileCredentialsProvider();
AmazonSQS sqs = new AmazonSQSClient(credentialsProvider, conf);
SetQueueAttributesRequest sqaRequest = new SetQueueAttributesRequest();
sqaRequest.setQueueUrl(queueUrl);
HashMap<String, String> attributes = new HashMap<String, String>();
attributes.put("DelaySeconds", "900");
sqaRequest.setAttributes(attributes);
sqs.setQueueAttributes(sqaRequest);
elasticsearch: "http://"+window.location.hostname+":9200",
※ If the host or port Elasticsearch runs on differs, change the line above.
cd /var/www/html
mongod
use test1
db.addUser({user:"guest",pwd:"hogehoge",roles:["dbAdminAnyDatabase","readWrite"]})
Authentication is configured only for the test1 DB, so this user cannot operate on other DBs.
mongod --auth
mongod is now running with authentication required.
Tue Nov 5 12:48:22.596 listDatabases failed:{ "ok" : 0, "errmsg" : "unauthorized" } at src/mongo/shell/mongo.js:46
mongo test1 -u guest -p hogehoge
Running show collections should now display the list of collections.
<?php
echo "check_ip <br>";
require_once "/usr/share/pear/Net/CheckIP.php";
$isip = Net_CheckIP::check_ip("192.168.1.1");
if ($isip) {
    echo "$isip";
}
?>
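As a point of comparison (my addition, not part of the original), the same kind of IPv4 validity check can be done in Python with the standard ipaddress module:

```python
import ipaddress

def check_ip(s):
    """Return True if s is a valid IPv4 address, else False."""
    try:
        ipaddress.IPv4Address(s)
        return True
    except ValueError:
        return False

print(check_ip('192.168.1.1'))  # True
print(check_ip('999.168.1.1'))  # False
```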
cd C:\Program Files (x86)\Growl for Windows
ruby check_ticket.rb
<dependency>
  <groupId>org.testng</groupId>
  <artifactId>testng</artifactId>
  <version>6.3.1</version>
  <scope>test</scope>
</dependency>
mvn clean
package com.kakakikikeke.test;

import org.testng.annotations.Test;
import org.testng.annotations.BeforeTest;
import org.testng.annotations.AfterTest;

public class AppTest {
  @BeforeTest
  public void beforeTest() {
    System.out.println("beforeTest");
  }
  @AfterTest
  public void afterTest() {
    System.out.println("afterTest");
  }
  @Test
  public void TestCase1() {
  }
}
After writing the above,
cd C:\Program Files (x86)\Growl for Windows
growlnotify.exe aaa
growl is basically used from the command line like this.
Choose a number or apply filter (format: [groupId:]artifactId, case sensitive contains): 312: (press Enter)
Choose org.apache.maven.archetypes:maven-archetype-quickstart version: (press Enter, default 1.1)
Define value for property 'version': 1.0-SNAPSHOT: : (press Enter)
Then confirm:
Confirm properties configuration:
groupId: com.mycompany.app
artifactId: my-app
version: 1.0-SNAPSHOT
package: com.mycompany.app
export ELASTIC_SEARCH=/usr/local/elasticsearch
export PATH=$ELASTIC_SEARCH/bin:$PATH
source /root/.bashrc
curl -X POST http://localhost:9200/dictionary/name/1 -d '{"famiry_name":"kaka","first_name":"abc","email" :"sample@sample.email.com","age":20}'
curl -X POST http://localhost:9200/dictionary/name/2 -d '{"famiry_name":"kaka","first_name":"def","email" :"sample@sample.email.com","age":20}'
curl -X POST http://localhost:9200/dictionary/name/3 -d '{"famiry_name":"kaka","first_name":"ghi","email" :"sample@sample.email.com","age":20}'
curl -X POST http://localhost:9200/dictionary/name/4 -d '{"famiry_name":"kaka","first_name":"jkl","email" :"sample@sample.email.com","age":20}'
curl -X POST http://localhost:9200/dictionary/name/5 -d '{"famiry_name":"kaka","first_name":"mno","email" :"sample@sample.email.com","age":20}'
You can set the path freely, in the form "/hoge/foo/n".
curl -X GET http://localhost:9200/dictionary/name/_search -d '{"query":{"match":{"famiry_name":"kaka"}},"size":10,"from":0}'
You can search by sending the request to the "/hoge/foo/_search" path.
curl -X GET http://localhost:9200/dictionary/name/_search -d '{"query":{"regexp":{"famiry_name":"ka.*"}},"size":10,"from":0}'
The basic usage is the same as in 2., except that "regexp" is used instead of "match".
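To make the two request bodies easier to compare, here is a small Python sketch (my addition; the helper name is made up) that builds both query bodies; only the query type and the value differ:

```python
import json

def search_body(query_type, field, value, size=10, start=0):
    """Build an Elasticsearch search request body like the curl examples above."""
    return json.dumps({'query': {query_type: {field: value}}, 'size': size, 'from': start})

match_body = search_body('match', 'famiry_name', 'kaka')
regexp_body = search_body('regexp', 'famiry_name', 'ka.*')
print(match_body)
print(regexp_body)
```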