Tuesday, November 15, 2016

Setting up the Kinesis Client Library with an Android App in Android Studio


1) Update the Gradle file with the following dependencies:

dependencies {
    compile 'com.amazonaws:aws-android-sdk-core:2.2.+'
    compile 'com.amazonaws:aws-android-sdk-s3:2.2.+'
    compile 'com.amazonaws:aws-android-sdk-kinesis:2.2.+'
    compile 'com.amazonaws:aws-android-sdk-ddb:2.2.+'
}

Build and sync the project with Gradle.
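Once the dependencies sync, a minimal use of the Kinesis recorder could look like the sketch below. This is only an illustrative sketch, not part of the original steps: the Cognito identity pool ID, region, and stream name are placeholders you would replace with your own.

import android.content.Context;
import com.amazonaws.auth.CognitoCachingCredentialsProvider;
import com.amazonaws.mobileconnectors.kinesis.kinesisrecorder.KinesisRecorder;
import com.amazonaws.regions.Regions;

public class KinesisHelper {
    // Buffers a record on disk, then pushes everything buffered so far to the stream.
    public static void sendRecord(Context context, String data) {
        CognitoCachingCredentialsProvider credentials = new CognitoCachingCredentialsProvider(
                context,
                "us-east-1:YOUR-IDENTITY-POOL-ID",   // placeholder Cognito identity pool ID
                Regions.US_EAST_1);                  // placeholder region

        KinesisRecorder recorder = new KinesisRecorder(
                context.getDir("kinesis_data", Context.MODE_PRIVATE),
                Regions.US_EAST_1,
                credentials);

        recorder.saveRecord(data.getBytes(), "YOUR-STREAM-NAME"); // stored locally first
        recorder.submitAllRecords();                              // network call: run off the main thread
    }
}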

2)
 More to Come 

Thursday, November 10, 2016

Your first Serverless architecture 1.1.0 (creating a project and functions), running offline with/without an endpoint



Install Node.js:

          https://nodejs.org/en/download/

Install Serverless:
       
  • npm install -g serverless

Create new serverless project:

       
  • serverless create --template aws-nodejs --path my-service
  • cd my-service


Inside my-service you will see 3 files.



Then you need to edit serverless.yml,

where you need to replace the default aws-nodejs service name with your own service name.

You can give the same or a different name to your functions.

Create events with an http endpoint, as in the sketch below.
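A minimal serverless.yml along these lines might look like the following sketch. The service name my-service, the function name sqlToCass, and the handler path are assumptions chosen to match the commands later in this post; adjust them to your own project:

service: my-service              # your own service name instead of aws-nodejs

provider:
  name: aws
  runtime: nodejs4.3

functions:
  sqlToCass:                     # function name used by the invoke example below
    handler: handler.sqlToCass   # handler.js must export a function called sqlToCass
    events:
      - http:
          path: sqlToCass        # served at /sqlToCass
          method: get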


Include the serverless-offline plugin in serverless.yml (see the snippet below) and install it by running the following line:
npm install serverless-offline --save-dev
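Registering the plugin is just a plugins entry in serverless.yml:

plugins:
  - serverless-offline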


You also need to install the Babel transpiler presets:

npm install --save-dev babel-preset-es2015

npm install --save-dev babel-preset-stage-2
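These presets are usually wired up through a .babelrc file in the project root; a minimal sketch, assuming standard Babel 6 configuration:

{
  "presets": ["es2015", "stage-2"]
}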


Once you are done with that, run the following command (-r sets the region):

sls offline -r us-west-2

Then hit http://localhost:3000/sqlToCass in the browser.

To invoke the function locally without the http endpoint:

 serverless invoke local --function sqlToCass
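For reference, the handler being invoked could be as small as the sketch below (handler.js; the function name sqlToCass and the response body are assumptions that match the serverless.yml sketch above):

'use strict';

// handler.js - minimal sketch of the function referenced in serverless.yml
module.exports.sqlToCass = (event, context, callback) => {
  const response = {
    statusCode: 200,
    body: JSON.stringify({ message: 'sqlToCass ran' }),
  };
  callback(null, response);
};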




Sunday, October 23, 2016

Installing the AWS Command Line Interface (CLI) on Mac (tackling the six error)


Step 1:
       
             Install Python 2.7.12: https://www.python.org/downloads/release/python-2712/

             
Step 2:

          Install pip:

          curl -O https://bootstrap.pypa.io/get-pip.py
          sudo python2.7 get-pip.py       OR
          sudo python get-pip.py

Step 3:

          sudo pip install awscli

       If you get an error caused by the pre-installed six package,

        use this command instead: sudo pip install --ignore-installed six awscli
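Once the install finishes, you can confirm the CLI is available and then set up your credentials:

        aws --version
        aws configure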




Thursday, June 9, 2016

How to use the Elasticsearch suggester to perform autocomplete

Use Kibana Sense to create your mappings.
Note: don't forget to add a suggest field of type completion to your mapping, as shown below. The actual suggestion values (input, output, payload, weight) are supplied per document when you index, not in the mapping (see the example that follows).

eg:
{
  "mappings": {
    "User": {
      "properties": {
        "fName": {
          "type": "string",
          "index": "not_analyzed"
        },
        "suggest": {
          "type": "completion",
          "payloads": true
        }
      }
    }
  }
}
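Each document you index then carries its own completion data in the suggest field. A sketch of indexing one document (this assumes the mapping above was created on the music index, matching the queries below): input holds the strings you get suggestions on, output is the autocomplete text that comes back, payload is extra data returned with the result, and weight sets the preference.

curl -X PUT 'localhost:9200/music/User/1?pretty' -d '{
    "fName" : "Nevermind",
    "suggest" : {
        "input": [ "Nevermind", "Nirvana" ],
        "output": "Nirvana - Nevermind",
        "payload" : { "artistId" : 2321 },
        "weight" : 34
    }
}'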

  

Then, to use the suggester, you query the index as shown below.
In this case "music" is the index that holds the song documents; you add a suggester and point it at the suggest field:


curl -X POST 'localhost:9200/music/_suggest?pretty' -d '{
    "song-suggest" : {
        "text" : "n",
        "completion" : {
            "field" : "suggest"
        }
    }
}'
To add a fuzzy suggester:
curl -X POST 'localhost:9200/music/_suggest?pretty' -d '{
    "song-suggest" : {
        "text" : "n",
        "completion" : {
            "field" : "suggest",
            "fuzzy" : {
                "fuzziness" : 2
            }
        }
    }
}'


That's it; your suggester should now be working.

NOTE: this search might also show suggestions from documents you have deleted.
If you don't want this to happen, you need to expunge deleted documents (by optimizing the index) whenever you delete or update data, as shown below:

$ curl -XPOST 'http://localhost:9200/_optimize?only_expunge_deletes=true'


for more info: https://www.elastic.co/guide/en/elasticsearch/reference/current/search-suggesters-completion.html


How to trigger events based on "Storage events" on Google Cloud


Create a storage bucket:
https://cloud.google.com/storage/docs/quickstart-console

Add a watch
https://cloud.google.com/storage/docs/object-change-notification#_Notification_Types

Create a function that triggers on watch
https://cloud.google.com/functions/calling#google_cloud_storage

Tuesday, June 7, 2016

Dynamic image resize using Google Cloud Storage (GCS) and getServingURL

You need to have:
a Google Cloud Storage bucket name
and
the following code.

Simply use my Git repo as shown below; just change your projectID and bucket name.

Use this Git repo:

https://github.com/koolkarni/resize-image-google-cloud-service-GetServingURL-PHP.git

OR

index.php:


<?php
syslog(LOG_WARNING, "Request came");

require_once 'google/appengine/api/cloud_storage/CloudStorageTools.php';
use google\appengine\api\cloud_storage\CloudStorageTools;
syslog(LOG_WARNING, "Imported Cloud Storage Tools");

// Image name and target size come in as query parameters, e.g. ?image=foo.jpg&size=200
$object_url = $_GET["image"];
$size = intval($_GET["size"]);
syslog(LOG_WARNING, "Object URL $object_url");
syslog(LOG_WARNING, "Size $size");

// Prepend the bucket path to the object name before requesting a serving URL
$bucket = "gs://YOUR-PROJECT-ID.appspot.com/bucket_name/";
$object_image_url = CloudStorageTools::getImageServingUrl($bucket . $object_url, ['size' => $size, 'crop' => false]);
syslog(LOG_WARNING, "Output Url $object_image_url");

// Redirect the client to the resized image
header("Location: $object_image_url");

closelog();
?>



app.yaml:

runtime: php55
api_version: 1

handlers:
- url: /.*
  script: index.php



Upload these files to a GitHub repo.

Once you have the code in Git, pull it into your Google Cloud Development --> Repository by following the steps below:


Steps 1 through 6: (screenshots showing how to connect the GitHub repository to the project's Google Cloud repository from the console)
Once your code is in the repo, open the GC console and run the following command in the GC shell.

Go to the project directory (i.e. the folder with the project ID), then run:

gcloud source repos clone image1 --project=Prj1234

where image1 is your repo name and Prj1234 is your project ID.

Now go inside your image1 folder and run:

gcloud preview app deploy app.yaml
To check your output, hit a URL like the example below, where image1.jpg is a file from your Cloud Storage bucket.
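Assuming the app is served from the default App Engine domain and takes the query parameters from index.php above, the URL would look something like this (the host, file name, and size are placeholders):

https://YOUR-PROJECT-ID.appspot.com/?image=image1.jpg&size=200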