MongoDB Administration Checklist for MySQL DBAs

2016-06-03 12:18:10

In this blog, I discuss a MongoDB administration checklist designed to help MySQL DBAs. If you are a MySQL DBA, starting MongoDB administration is not always an easy transition. Although most of the concepts, and even much of the implementation, are similar, the commands are different. The following sections map typical MySQL concepts and DBA tasks to the corresponding MongoDB ones. If you happen to be a MongoDB DBA and want to learn MySQL administration, you can read the same mapping in the opposite direction.

I've also created a webinar, MongoDB administration for MySQL DBA, that explains the above concepts. You can download the slides to use as a reference.

Don't forget about our upcoming event, Community Open House for MongoDB, in New York on June 30, 2016. There will be technical presentations and sessions from key members of the MongoDB open source community. This event is free of charge and open to all.

Architecture: Basic Concepts

Replication:

Sharding:

Day-to-day Operations

MySQL: SELECT

mysql> select * from zips limit 1\G
country_code: US
postal_code: 34050
place_name: FPO
admin_name1:
admin_code1: AA
admin_name2: Erie
admin_code2: 029
admin_name3:
admin_code3:
latitude: 41.03750000
longitude: -111.67890000
accuracy:
1 row in set (0.00 sec)

MongoDB: find

MongoDB shell version: 3.0.8
connecting to: zips
> db.zips.find().limit(1).pretty()
{
  "_id": "01001",
  "city": "AGAWAM",
  "loc": [-72.622739, 42.070206],
  "pop": 15338,
  "state": "MA"
}

MySQL: Schema

CREATE TABLE users (
  id MEDIUMINT NOT NULL AUTO_INCREMENT,
  user_id VARCHAR(30),
  age INT,
  status CHAR(1),
  PRIMARY KEY (id)
);

MongoDB: Flexible Schema

db.users.insert({
  user_id: "abc123",
  age: 55,
  status: "A"
})

MySQL: Config file

MongoDB: /etc/mongod.conf

# Where and how to store data.
storage:
  dbPath: /datawt
  journal:
    enabled: true
  engine: wiredTiger
...
/usr/bin/mongod -f /etc/mongod.conf

MySQL: Databases

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
...
mysql> use zips
Database changed
mysql> show tables;
+----------------+
| Tables_in_zips |
+----------------+
| zips           |
+----------------+

MongoDB: Databases

> show dbs;
admin  0.000GB
local  0.000GB
osm    13.528GB
test   0.000GB
zips   0.002GB
> use zips
switched to db zips
> show collections
zips
> show tables   // same
zips

MySQL: Storage Engines

MyISAM, InnoDB, TokuDB, MyRocks

MongoDB: Storage Engines

MMAPv1 (memory mapped), WiredTiger (transactional + compression), TokuMX/PerconaFT, RocksDB

MySQL: Processlist

mysql> show processlist\G
Id: 137259
User: root
Host: localhost
db: geonames
Command: Query
Time: 0
State: init
Info: show processlist
Rows_sent: 0
Rows_examined: 0
1 row in set (0.00 sec)

MongoDB: currentOp()

> db.currentOp()
"inprog": [{
  "desc": "conn28",
  "threadId": "0x19b85260",
  "connectionId": 28,
  "opid": 27394208,
  "active": true,
  "secs_running": 3,
  "microsecs_running": NumberLong(3210539),
  "op": "query",
  "ns": "osm.points3",
  "query": { "name": "Durham" },
  "planSummary": "COLLSCAN",
  "client": "",
  "numYields": 24905,
  "locks": {
    "Global": "r",
    "Database": "r",
    "Collection": "r"
  },
  "waitingForLock": false,
  ...

MySQL: Grants

mysql> grant all on *.* to user@localhost identified by 'pass';

MongoDB: createUser

> use products
db.createUser({
  user: "accountUser",
  pwd: "password",
  roles: ["readWrite", "dbAdmin"]
})

MySQL: Index

mysql> show keys from zips\G
**** 1. row ****
Table: zips
Non_unique: 0
Key_name: PRIMARY
Seq_in_index: 1
Column_name: id
Collation: A
Cardinality: 0
Sub_part: NULL
Packed: NULL
Null:
Index_type: BTREE

MongoDB: Index

> db.zips.getIndexes()
{
  "v": 1,
  "key": { "_id": 1 },
  "name": "_id_",
  "ns": "zips.zips"
}

MySQL: Add Index

mysql> alter table zips add key (postal_code);
Query OK, 0 rows affected (0.10 sec)
Records: 0  Duplicates: 0  Warnings: 0

MongoDB: Create Index

> db.zips.createIndex({state: 1})
{
  "createdCollectionAutomatically": false,
  "numIndexesBefore": 1,
  "numIndexesAfter": 2,
  "ok": 1
}
// Index can be sorted:
> db.zips.createIndex({state: -1})
{
  "createdCollectionAutomatically": false,
  "numIndexesBefore": 2,
  "numIndexesAfter": 3,
  "ok": 1
}

MySQL: Explain

mysql> explain select * from zips where place_name='Durham'\G
**** 1. row ****
id: 1
select_type: SIMPLE
table: zips
type: ref
possible_keys: place_name
key: place_name
key_len: 183
ref: const
rows: 25
Extra: Using index condition
1 row in set (0.00 sec)

MongoDB: Explain

> db.zips.find({"city": "DURHAM"}).explain()
{
  "queryPlanner": {
    "plannerVersion": 1,
    "namespace": "zips.zips",
    "indexFilterSet": false,
    "parsedQuery": { "city": { "$eq": "DURHAM" } },
    "winningPlan": {
      "stage": "COLLSCAN",
      "filter": { "city": { "$eq": "DURHAM" } },
      "direction": "forward"
    },
    "rejectedPlans": []
  },
  "serverInfo": {...},
  "ok": 1
}

MySQL: Alter Table

mysql> alter table wikistats_innodb_n add url_md5 varbinary(16);
Query OK, 0 rows affected (37 min 10.03 sec)
Records: 0  Duplicates: 0  Warnings: 0
mysql> update wikistats_innodb_n set url_md5 = unhex(md5(lower(url)));
Query OK, 85923501 rows affected (42 min 29.05 sec)
Rows matched: 85923511  Changed: 85923501
...

MongoDB: Flexible Schema

No ALTER statement. Just insert the new document version; different documents can have different schema versions.

MySQL: Slow Query Log

mysql> set global long_query_time=0.1;
Query OK, 0 rows affected (0.02 sec)
mysql> set global slow_query_log=1;
Query OK, 0 rows affected (0.02 sec)
mysql> show global variables like 'slow_query_log_file'\G
**** 1. row ****
Variable_name: slow_query_log_file
Value: /var/lib/mysql/slow.log
1 row in set (0.00 sec)

MongoDB: Profiling

// db.setProfilingLevel(level, slowms)
// Level: 0 = no profiling,
//        1 = only slow ops,
//        2 = all ops
// slowms is the same as long_query_time, in milliseconds
> db.setProfilingLevel(2, 100);
{ "was": 0, "slowms": 100, "ok": 1 }
> db.system.profile.find({millis: {$gt: 100}}).pretty()
"op": "query",
"ns": "zips.zips",
"query": { "city": "DURHAM" },
"ntoreturn": 0,
...

MySQL: Percona Toolkit

$ pt-query-digest --limit 100 slow.log

MongoDB: Mtools

$ mlogfilter mongo.log-20150915 --from 'Sep 14 06:00:00' --to 'Sep 14 23:59:59' | mlogvis --line-max 100000 --out 'mongo.html'

MySQL 5.7: GIS

SELECT osm_id, name,
       round(st_distance_sphere(shape,
             st_geomfromtext('POINT (-78.9064543 35.9975194)', 1)), 2) as dist,
       st_astext(shape)
FROM points_new
WHERE st_within(shape, create_envelope(@lat, @lon, 10))
  AND (other_tags like '%"amenity"=>"cafe"%'
       OR other_tags like '%"amenity"=>"restaurant"%')
  AND name IS NOT NULL
ORDER BY dist ASC
LIMIT 10;

MongoDB 3.2: GIS

db.runCommand({
  geoNear: "points",
  near: { type: "Point", coordinates: [-78.9064543, 35.9975194] },
  spherical: true,
  query: {
    name: { $exists: true, $ne: null },
    "other_tags": { $in: [
      /.*amenity=>restaurant.*/,
      /.*amenity=>cafe.*/ ]}
  },
  "limit": 5,
  "maxDistance": 10000
})

MySQL: Backup

Backup: mysqldump -A > dump.sql; restore: mysql < dump.sql
Stop a replication slave and copy the files
Percona XtraBackup

MongoDB: Backup

Backup: mongodump; restore: mongorestore
Stop a replica and copy the files
Percona HotBackup (for TokuMX only)
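For scripted administration, it can also help to see how simple MySQL WHERE clauses map onto MongoDB filter documents. The sketch below builds such filters as plain Python dictionaries; the helper names `where_eq` and `where_range` are made up for illustration, but the resulting dicts are in exactly the shape you would pass to `collection.find()` with the pymongo driver:

```python
# Hypothetical helpers translating simple SQL WHERE clauses into
# MongoDB filter documents (plain dicts).

def where_eq(**conditions):
    # WHERE col = value  ->  {"col": value}
    return dict(conditions)

def where_range(field, low=None, high=None):
    # WHERE col BETWEEN low AND high  ->  {"col": {"$gte": low, "$lte": high}}
    ops = {}
    if low is not None:
        ops["$gte"] = low
    if high is not None:
        ops["$lte"] = high
    return {field: ops}

# SELECT * FROM zips WHERE state = 'MA'
print(where_eq(state="MA"))              # {'state': 'MA'}

# SELECT * FROM zips WHERE pop BETWEEN 10000 AND 20000
print(where_range("pop", 10000, 20000))  # {'pop': {'$gte': 10000, '$lte': 20000}}
```

Unlike SQL, where the condition is parsed from a string, the MongoDB query language is data: you can build, inspect and compose filters programmatically before sending them to the server.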

Push for encryption law falters despite Apple case spotlight

2016-05-27 11:13:25

WASHINGTON/SAN FRANCISCO - After a rampage that left 14 people dead in San Bernardino, key U.S. lawmakers pledged to seek a law requiring technology companies to give law enforcement agencies a "back door" to encrypted communications and electronic devices, such as the iPhone used by one of the shooters.

Now, only months later, much of the support is gone, and the push for legislation is dead, according to sources in congressional offices, the administration and the tech sector. Draft legislation that Senators Richard Burr and Dianne Feinstein, the Republican and Democratic leaders of the Intelligence Committee, had circulated weeks ago likely will not be introduced this year and, even if it were, would stand no chance of advancing, the sources said.

Key among the problems was the lack of White House support for legislation in spite of a high-profile court showdown between the Justice Department and Apple Inc over the suspect's iPhone, according to congressional and Obama administration officials and outside observers.

"They've dropped anchor and taken down the sail," former NSA and CIA director Michael Hayden said.

For years, the Justice Department lobbied unsuccessfully for a way to unmask suspects who "go dark," or evade detection through coded communications and locked devices. When the Federal Bureau of Investigation took Apple to court in February to try to open the iPhone in its investigation of the San Bernardino slayings, the cause gained traction in Washington. The political landscape had shifted - or so it seemed.

The short life of the push for legislation illustrates the intractable nature of the debate over digital surveillance and encryption, which has been raging in one form or another since the 1990s.

Tech companies, backed by civil liberties groups, insist that building law enforcement access into phones and other devices would undermine security for everyone, including the U.S. government itself. Law enforcement agencies maintain they need a way to monitor phone calls, emails and text messages, along with access to encrypted data. Polls show the public is split on whether the government should have access to all digital data.

The legal battle between the FBI and Apple briefly united many around the idea that Congress - not the courts - should decide the issue. But the consensus was fleeting. Feinstein's Democratic colleagues on the Intelligence Committee - along with some key Republicans - backed away. The House never got on board.

The CIA and NSA were ambivalent, according to several current and former intelligence officials, in part because officials in the agencies feared any new law would interfere with their own encryption efforts. Even supporters worried that if a bill were introduced but failed, it would give Apple and other tech companies another weapon to use in future court battles.

Burr had said repeatedly that legislation was imminent. But last week, he and Feinstein told Reuters there was no timeline for the bill. Feinstein said she planned to talk to more tech stakeholders, and Burr said, "be patient."

In the meantime, tech companies have accelerated encryption efforts in the wake of the Apple case. The court showdown ended with a whimper when the FBI said it had found a way to get into the phone, and subsequently conceded privately it had found nothing of value.
THE FBI GOES TO BATTLE

A week after the San Bernardino attack, Burr told Reuters passing encryption legislation was urgent because "if we don't, we will be reading about terrorist attacks on a more frequent basis." FBI Director James Comey told the Senate Intelligence Committee soon after that encryption was "overwhelmingly affecting" the investigation of murders, drug trafficking and child pornography.

A week later, the Justice Department persuaded a judge to issue a sweeping order demanding Apple write software to open an iPhone used by San Bernardino suspect Sayeed Farook, who died in a shootout with law enforcement. Apple fought back, arguing, among other things, that only congressional legislation could authorize what the court was demanding. Many saw the Justice Department's move as a way to bring pressure on Congress to act.

President Obama appeared to tacitly support Comey's court fight and the idea that there should be limits on criminal suspects' ability to hide behind encryption. But even as the drive for legislation seemed to be gaining momentum, consensus was dissipating.

Senator Lindsey Graham, an influential Republican, withdrew support in a sudden about-face. "I was all with you until I actually started getting briefed by the people in the intel community," Graham told Attorney General Loretta Lynch during a hearing in March. "I'm a person that's been moved by the arguments of the precedent we set and the damage we may be doing to our own national security."

On the Democratic side, Senator Ron Wyden vowed to filibuster what he called a "dangerous proposal" that "would leave Americans more vulnerable to stalkers, identity thieves, foreign hackers and criminals." Senator Mark Warner advanced a competing bill to form a commission to study the issue.
A half dozen people familiar with the White House deliberations said they were hamstrung by a long-standing split within the Obama administration, pitting Comey and the DOJ against technology advisors and other agencies, including the Commerce and State Departments. They also said there was reluctance to take on the tech industry in an election year.

(Reporting by Dustin Volz and Mark Hosenball in Washington and Joseph Menn in San Francisco; Editing by Jonathan Weber and Lisa Girion)

Scalaz Features for Everyday Usage Part 3: State Monad, Writer Monad, and Lenses

2016-05-19 20:13:05

In this article in the mini-series on Scalaz, we'll look at a couple of additional monads and patterns available in Scalaz. Once again, we'll look at stuff that is practical to use and avoid the inner details of Scalaz. To be more precise, in this article we'll look at:

Writer monad: keep track of a sort of logging during a set of operations
State monad: have an easy way of tracking state across a set of computations
Lenses: easily access deeply nested attributes and make copying case classes more convenient

We'll start with one of the additional monads provided by Scalaz.

Writer Monad

Basically, each writer has a log and a return value. This way you can just write your clean code, and at a later point determine what you want to do with the logging (e.g. validate it in a test, output it to the console, or write it to some log file). So, for instance, we could use a writer to keep track of the operations we've executed to get to some specific value. Let's look at the code and see how this thing works:

import scalaz._
import Scalaz._

object WriterSample extends App {

  // the left side can be any monoid, e.g. something which supports
  // concatenation and has an empty function: String, List, Set, etc.
  type Result[T] = Writer[List[String], T]

  def doSomeAction(): Result[Int] = {
    // do the calculation to get a specific result
    val res = 10
    // create a writer by using set
    res.set(List(s"Doing some action and returning res"))
  }

  def doingAnotherAction(b: Int): Result[Int] = {
    // do the calculation to get a specific result
    val res = b * 2
    // create a writer by using set
    res.set(List(s"Doing another action and multiplying $b with 2"))
  }

  def andTheFinalAction(b: Int): Result[String] = {
    val res = s"bb:$b:bb"
    // create a writer by using set
    res.set(List(s"Final action is setting $b to a string"))
  }

  // returns a tuple (List, Int)
  println(doSomeAction().run)

  val combined = for {
    a <- doSomeAction()
    b <- doingAnotherAction(a)
    c <- andTheFinalAction(b)
  } yield c

  // returns a tuple (List, String)
  println(combined.run)
}

In this sample we've got three operations that do something. In this case, they don't really do that much, but that doesn't matter. The main thing is that instead of returning a value, we return a Writer (note that we could have also created the writer in the for comprehension), by using the set function. When we call run on a Writer, we don't just get the result of the operation, but also the aggregated values collected by the Writer. So when we write:

type Result[T] = Writer[List[String], T]

def doSomeAction(): Result[Int] = {
  // do the calculation to get a specific result
  val res = 10
  // create a writer by using set
  res.set(List(s"Doing some action and returning res"))
}

println(doSomeAction().run)

The result looks like this: (List(Doing some action and returning res),10).
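If the Scala syntax gets in the way, the core idea of the Writer monad (a value paired with an accumulating log) can be sketched in a few lines of plain Python. This is an illustration of the pattern, not Scalaz; the function names simply mirror the Scala sample:

```python
# Each step returns (value, log_entries); run_writer threads the value
# through the steps and concatenates the logs, like Writer.run in Scalaz.

def do_some_action(_):
    return 10, ["Doing some action and returning res"]

def doing_another_action(b):
    return b * 2, [f"Doing another action and multiplying {b} with 2"]

def and_the_final_action(b):
    return f"bb:{b}:bb", [f"Final action is setting {b} to a string"]

def run_writer(steps):
    value, log = None, []
    for step in steps:
        value, entries = step(value)
        log.extend(entries)
    return log, value

log, value = run_writer([do_some_action, doing_another_action, and_the_final_action])
print((log, value))  # the aggregated log plus the final value 'bb:20:bb'
```

The point is the same as in Scalaz: none of the steps print or store anything themselves; the caller decides what to do with the collected log.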
Not that exciting, but it becomes more interesting when we start using the writers in a for-comprehension:

val combined = for {
  a <- doSomeAction()
  b <- doingAnotherAction(a)
  c <- andTheFinalAction(b)
} yield c

// returns a tuple (List, String)
println(combined.run)

If you look at the output from this, you'll see something like:

(List(Doing some action and returning res, Doing another action and multiplying 10 with 2, Final action is setting 20 to a string),bb:20:bb)

As you can see, we've gathered up all the different log messages in a List[String], and the resulting tuple also contains the final calculated value. When you don't want to add the Writer instantiation in your functions, you can also just create the writers in a for-comprehension like so:

val combined2 = for {
  a <- doSomeAction1() set(" Executing Action 1 ")   // a String is a monoid too
  b <- doSomeAction2(a) set(" Executing Action 2 ")
  c <- doSomeAction2(b) set(" Executing Action 3 ")
  // c <- WriterT.writer("bla", doSomeAction2(b))    // alternative construction
} yield c

println(combined2.run)

The result of this sample is this:

( Executing Action 1  Executing Action 2  Executing Action 3 ,5)

Cool, right? For this sample we've only shown the basic Writer stuff, where the type is just a simple type. You can of course also create Writer instances from more complex types. An example of this can be found here.

State Monad

Another interesting monad is the State monad, which provides a convenient way to handle state that needs to be passed through a set of functions. You might need to keep track of results, need to pass some context around a set of functions, or require some (im)mutable context for another reason. With the Reader monad we already saw how you could inject some context into a function. That context, however, wasn't changeable.
With the state monad, we're provided with a nice pattern we can use to pass a mutable context around in a safe and pure manner. Let's look at some examples:

case class LeftOver(size: Int)

/** A state transition, representing a function `S => (S, A)`. */
type Result[A] = State[LeftOver, A]

def getFromState(a: Int): Result[Int] = {
  // do all kinds of computations
  State[LeftOver, Int] {
    // just return the amount of stuff we got from the state
    // and return the new state
    case x => (LeftOver(x.size - a), a)
  }
}

def addToState(a: Int): Result[Int] = {
  // do all kinds of computations
  State[LeftOver, Int] {
    // just return the amount of stuff we added to the state
    // and return the new state
    case x => (LeftOver(x.size + a), a)
  }
}

val res: Result[Int] = for {
  _ <- addToState(20)
  _ <- getFromState(5)
  _ <- getFromState(5)
  a <- getFromState(5)
  currentState <- get[LeftOver]                 // get the state at this moment
  manualState <- put[LeftOver](LeftOver(9000))  // set the state to some new value
  b <- getFromState(10)                         // and continue with the new state
} yield {
  println(s"currentState: $currentState")
  a
}

// we start with state 10 and run the transitions against it,
// without having to pass state around using implicits or something else
println(res(LeftOver(10)))

As you can see, in each function we get the current context, make some changes to it, and return a tuple consisting of the new state and the value of the function. This way each function has access to the State, can return a new one, and returns this new state together with the function's value as a tuple. When we run the above code we see the following:

currentState: LeftOver(15)
(LeftOver(8990),5)

As you can see, each of the functions does something with the state. With the get[S] function we can get the value of the state at the current moment, and in this example we print that out.
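The pattern itself is not Scalaz-specific: a state transition is just a function from a state to a (new state, value) pair, and running a computation means threading the state through the transitions. A minimal Python sketch of the same LeftOver bookkeeping, using a plain int for the size (names are illustrative):

```python
# A "state transition" is a function state -> (new_state, value).

def add_to_state(a):
    return lambda size: (size + a, a)

def get_from_state(a):
    return lambda size: (size - a, a)

def run_state(transitions, initial):
    # thread the state through each transition, like res(LeftOver(10))
    state, value = initial, None
    for t in transitions:
        state, value = t(state)
    return state, value

# start with 10, add 20, then take 5 three times:
# state ends at 15, the last value produced is 5
print(run_state([add_to_state(20), get_from_state(5),
                 get_from_state(5), get_from_state(5)], 10))  # (15, 5)
```

What Scalaz adds on top of this skeleton is the monadic plumbing (for-comprehensions, get, put) so the threading never has to be written by hand.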
Besides using the get function, we can also set the state directly using the put function. As you can see, this is a very nice and simple-to-use pattern, and great when you need to pass some state around a set of functions.

Lenses

So, enough with the monads for now; let's look at Lenses. With Lenses it is possible to easily (well, easier than copying case classes by hand) change values in nested object hierarchies. Lenses can do many things, but in this article I'll introduce just some basic features. First, the code:

import scalaz._
import Scalaz._

object LensesSample extends App {

  // crappy case model, lack of creativity
  case class Account(userName: String, person: Person)
  case class Person(firstName: String, lastName: String, address: List[Address], gender: Gender)
  case class Gender(gender: String)
  case class Address(street: String, number: Int, postalCode: PostalCode)
  case class PostalCode(numberPart: Int, textPart: String)

  val acc1 = Account("user123", Person("Jos", "Dirksen",
    List(Address("Street", 1, PostalCode(12, "ABC")),
         Address("Another", 2, PostalCode(21, "CDE"))), Gender("male")))

  val acc2 = Account("user345", Person("Brigitte", "Rampelt",
    List(Address("Blaat", 31, PostalCode(67, "DEF")),
         Address("Foo", 12, PostalCode(45, "GHI"))), Gender("female")))

  // when you now want to change something, say the gender (just because
  // we can), we need to start copying stuff
  val acc1Copy = acc1.copy(
    person = acc1.person.copy(
      gender = Gender("something")
    )
  )

In this sample we defined a couple of case classes and want to change a single value. For case classes this means that we have to start nesting a set of copy operations to correctly change one of the nested values. While this can be done for simple hierarchies, it quickly becomes cumbersome.
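The nested-copy problem is not unique to Scala, and neither is the lens solution. As a language-neutral illustration (a toy, not Scalaz's Lens API), a lens can be modeled as a get/set pair over immutable records, and composing two lenses gives direct access to a deeply nested field:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Gender:
    gender: str

@dataclass(frozen=True)
class Person:
    first_name: str
    gender: Gender

@dataclass(frozen=True)
class Account:
    user_name: str
    person: Person

class Lens:
    """A lens is just a getter plus a copying setter."""
    def __init__(self, get, set_):
        self.get, self.set = get, set_

    def compose(self, inner):
        # focus through this lens first, then through the inner lens
        return Lens(
            lambda whole: inner.get(self.get(whole)),
            lambda whole, part: self.set(whole, inner.set(self.get(whole), part)),
        )

person_lens = Lens(lambda a: a.person, lambda a, p: replace(a, person=p))
gender_lens = Lens(lambda p: p.gender, lambda p, g: replace(p, gender=g))
account_gender_lens = person_lens.compose(gender_lens)

acc = Account("user123", Person("Jos", Gender("male")))
updated = account_gender_lens.set(acc, Gender("Blaat"))
print(updated.person.gender)  # Gender(gender='Blaat')
print(acc.person.gender)      # original untouched: Gender(gender='male')
```

The compose step is the interesting part: the nested copies still happen, but they are written once, inside the lens, instead of at every call site.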
With lenses you're offered a mechanism to do this in a composable way:

val genderLens = Lens.lensu[Account, Gender](
  (account, gender) => account.copy(person = account.person.copy(gender = gender)),
  (account) => account.person.gender
)

// and with a lens we can now directly set the gender
val updated = genderLens.set(acc1, Gender("Blaat"))
println(updated)
// Output: Account(user123,Person(Jos,Dirksen,List(Address(Street,1,PostalCode(12,ABC)),
//         Address(Another,2,PostalCode(21,CDE))),Gender(Blaat)))

So we define a Lens, which can change a specific value in the hierarchy. With this lens we can now directly get or set a value in a nested hierarchy. We can also create a lens which modifies a value and returns the modified object in one go by using the =>= operator:

// we can use our base lens to create a modify lens
val toBlaBlaLens = genderLens =>= (_ => Gender("blabla"))
println(toBlaBlaLens(acc1))
// Output: Account(user123,Person(Jos,Dirksen,List(Address(Street,1,PostalCode(12,ABC)),
//         Address(Another,2,PostalCode(21,CDE))),Gender(blabla)))

val existingGender = genderLens.get(acc1)
println(existingGender)
// Output: Gender(male)

And we can use the >=> and the <=< operators to combine lenses together.
For example, in the following code sample we create two separate lenses, which are then combined and executed:

// First create a lens that returns a person
val personLens = Lens.lensu[Account, Person](
  (account, person) => account.copy(person = person),
  (account) => account.person
)

// get the person's last name
val lastNameLens = Lens.lensu[Person, String](
  (person, lastName) => person.copy(lastName = lastName),
  (person) => person.lastName
)

// Get the person, then get the last name, and then set the last name to
// a new last name
val combined = (personLens >=> lastNameLens) =>= (_ => "New LastName")
println(combined(acc1))
// Output: Account(user123,Person(Jos,New LastName,List(Address(Street,1,PostalCode(12,ABC)),
//         Address(Another,2,PostalCode(21,CDE))),Gender(male)))

Conclusion

There are still two subjects I want to write about, and those are Validations and Free monads. In the next article in this series, I'll show how you can use ValidationNEL for validations. Free monads, however, don't really fall into the category of everyday usage, so I'll spend a couple of other articles on that in the future.

Microservices Arrived at Your Home

2016-05-11 17:13:05

As more and more things are being connected to the Internet, there is necessarily a need to integrate these devices together. We have some great opportunities to be really productive by partitioning huge problems into smaller and smaller pieces and solving them one by one. We can easily develop a simple service, put it into a Docker container and deploy it to any cloud solution. Later we can connect the services together and let them do a huge job.

The services are being developed by teams spread worldwide and integrated together as needed. The same good old service-oriented architecture principles apply here as well. However, the integration part is the one that has changed from the past. We no longer put the services together into a single application container; we rather deploy them standalone. This freedom allows us to spawn more instances of the same service to handle higher load; it is more failure resilient (one failed deployment does not necessarily break the others when we use circuit breakers); we can use less powerful virtual machines for hosting the services; and we believe you can come up with even more advantages.

However, how do we develop these so-called microservices? Or even better, how do we reuse our existing code and convert it to microservices? How could our developers use the skills they already have to develop the microservices? And how do we leverage all the great enterprise solutions deployed behind the gateway? It seems like we might use some smart glue. Fortunately, there is one: it is called SilverWare.

By way of an example, we would like to show you how you can easily develop, integrate and deploy microservices using the skills you already know - Java and CDI. The scenario is built around controlling an intelligent home.
We will be initiating actions from a mobile device (phone, tablet), processing them in a Business Rules Management System, creating commands and passing them through a workflow to individual things/actuators, and monitoring the intelligent home status, which will be displayed on the mobile device and used in the rules system to derive further commands. The status of the Things is going to be cached for later processing by the Rules, and for the case when the Things are offline or cannot provide their status. All this is in line with the Reference Architecture of the Internet of Things.

The software pieces used in the example are:

- JBoss Business Rules Management System (with its core component called Drools) for the decision engine
- JBoss A-MQ for MQTT messaging and topics to pass events/messages between services
- Apache Camel (part of JBoss Fuse) for integrating services, topics and Things, and also for implementing the workflow that passes Commands to individual Things
- OpenShift Enterprise 3 to host all the previous parts and microservices
- SilverWare as the secret sauce that binds it all together

The cool part is a real 1:20 model of an intelligent home that contains the following "live" parts: LED lights, a TV set (audio only), a servo-controlled door and window, an A/C fan, a fireplace (with a light bulb emulating the fire), and an RFID reader to find out who is at home.

We have three key players in the demo - the mobile phone invoking actions and monitoring the status of the whole system, the intelligent home model that receives commands and reports its state, and the OpenShift v3 installation hosting all the software components.

The home hardware is driven by a Raspberry Pi running Bulldog, the universal Java library for accessing hardware pins on ARM boards, and SilverSpoon, the set of Camel components to communicate with various sensors. All the controls are available to the outside world via REST.
There is JBoss A-MQ running on the Raspberry Pi, hosting an MQTT topic where the information about the home is posted. The mobile phone invokes Actions via REST requests to the microservices hosted on OpenShift. It also displays status about the home from an MQTT topic hosted on JBoss A-MQ running in OpenShift.

In the parts that we have implemented, we stuck to the IoT Reference Architecture. The devices (mobile phone and the home) generate actions; these are processed through the Business Rules Management System (Drools), and the resulting Commands are passed through a workflow that sends them to the correct consumers represented by the individual Things. We also cache the state of the Things for the case when they do not report their state back or go offline. Simply put, the latest state sent through a Command for a particular Thing is stored in the cache. For simplicity, we use a hash map as the cache; later we plan to add a distributed cache like JBoss Data Grid.

Currently, the workflow is implemented using Camel routes; however, a full-blown Business Process Management System (like JBoss BPM Suite) could be used here. It is all implemented using four microservices written as CDI beans. The microservices are hosted in a single fat JAR file using the SilverWare platform. SilverWare is just an integration layer that manages the lifecycle of various frameworks (i.e. service providers) and their components. One such service provider is Weld (the CDI reference implementation), and its components are CDI beans. More details on the SilverWare framework and its idea can be found in our previous article. The source code for the intelligent home and all the microservices can be found on GitHub.

Let's have a look at a typical use case of the whole system. A user pushes a button on their mobile phone, which creates a corresponding Action that is consumed by the MobileGateway Microservice.
This microservice has a public REST API and converts the requests to Java objects that are serialized and sent to the Actions MQTT topic. All the Actions are picked up by the Drools Microservice and passed through the Business Rules Engine. Based on the user-defined rules, there are typically several Commands generated as a response to an Action. The Commands are sent to the Commands MQTT topic. The state changes carried in the Commands are stored in a cache for later use in the Rules Engine. At the same time, the Commands are passed to the workflow, which routes the Commands to the corresponding REST APIs installed in the intelligent home. Some of the Commands can also provide updates to the mobile phone; we'll get to this in a moment.

The intelligent home periodically publishes status updates like indoor temperature, humidity and the RFID tags present in the home. These are published to the Status update MQTT topic and consumed by the Weather Microservice (probably a candidate to be renamed), which converts them to Actions. Such status update actions generate Commands that provide status information back to the mobile phone through the Mobile update topic. With this, we have covered all the microservices and routes in the diagram above. Now we can inspect some pieces of the implementation.
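Before diving into the real code, the action-to-command flow just described can be caricatured in a few lines of Python. This is only a sketch of the message flow; the in-memory bus, topic names and command strings are all made up, and in the actual demo these roles are played by MQTT topics, Drools and Camel:

```python
# A toy in-memory message bus standing in for the MQTT topics.
class Bus:
    def __init__(self):
        self.handlers = {}

    def subscribe(self, topic, handler):
        self.handlers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        for handler in self.handlers.get(topic, []):
            handler(message)

bus = Bus()
cache = {}  # latest state per Thing, like the hash-map cache in the demo

def rules_engine(action):
    # stands in for Drools: one action typically yields several commands
    if action == "mood:evening":
        for command in ("lights:on", "fireplace:heat", "media:news"):
            bus.publish("commands", command)

def command_workflow(command):
    # stands in for the Camel workflow routing commands to the Things;
    # the state carried by each command is remembered in the cache
    thing, state = command.split(":")
    cache[thing] = state

bus.subscribe("actions", rules_engine)
bus.subscribe("commands", command_workflow)

bus.publish("actions", "mood:evening")
print(cache)  # {'lights': 'on', 'fireplace': 'heat', 'media': 'news'}
```

The decoupling is the point: the rules engine never talks to a Thing directly; it only publishes commands, and the workflow decides where each one goes.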
For example, here is the Drools Microservice that consumes Actions from the Actions topic, passes them through the Business Rules Engine and sends the resulting Commands to the Commands workflow:

@Microservice
public class DroolsMicroservice {

   private static final Logger log = LogManager.getLogger(DroolsMicroservice.class);

   // KieSession is not thread safe, we need to synchronize calls
   private static Semaphore sync = new Semaphore(1);

   @Inject
   @KSession
   private KieSession session;

   @Inject
   @MicroserviceReference
   private ProducerTemplate producer;

   @Inject
   @MicroserviceReference
   private CacheMicroservice cache;

   public void processActions(final List<Action> actions) throws InterruptedException {
      log.info("Firing rules for action {}", actions);
      sync.acquire();
      try {
         final EntryPoint entryPoint = session.getEntryPoint("actions");
         session.setGlobal("producer", producer);
         session.setGlobal("cache", cache.getCache());
         session.registerChannel("commands", cmd -> producer.asyncSendBody("direct:commands", cmd));
         actions.forEach(entryPoint::insert);
         session.fireAllRules();
      } finally {
         sync.release();
      }
   }
}

In the CDI bean annotated with @Microservice, we inject the BRMS Knowledge session, the Camel message producer and the Cache Microservice. Actions from the Actions topic are directly routed to the processActions() method. We synchronize the calls, as KieSession is not thread-safe. The Actions are passed as facts through an entry point to the Knowledge Session, together with the Camel message producer and the current cache holding the state of the individual Things.
For the output of newly created Commands, we register a channel that sends the Commands to the corresponding Camel route connected to the Commands workflow. A sample business rule that reacts to the mobile phone's button to set an evening mood in the home looks like this:

rule "Evening mood action"
when
   $mood: MoodAction(mood == MoodAction.Mood.EVENING) from entry-point "actions"
then
   System.out.println("Processing mood evening");
   channels["commands"].send(new BatchLightCommand(
      new LightCommand(LightCommand.Place.ALL, LedState.ON),
      new LightCommand(LightCommand.Place.LIVINGROOM_FIREPLACE, new LedState(10, 10, 10)),
      new LightCommand(LightCommand.Place.LIVINGROOM_LIBRARY, new LedState(10, 10, 10)),
      new LightCommand(LightCommand.Place.LIVINGROOM_COUCH, new LedState(10, 10, 10))
   ));
   channels["commands"].send(new FireplaceCommand(FireplaceCommand.Fire.HEAT));
   channels["commands"].send(new MediaCenterCommand(MediaCenterCommand.Media.NEWS));
end

Upon receipt of a MoodAction with the Evening mood value, we create light commands to set the light conditions, we turn on the fireplace, and we turn on the media center with the news channel. More details on Drools can be found in the Drools documentation.

Apache Camel is a great project for implementing Enterprise Integration Patterns. Camel is very easy to use because the integration Camel routes are self-describing. For example, the route that reads messages from the Actions topic and sends them to the Drools Microservice looks as follows:

from("mqtt:inActions?subscribeTopicName=ih/message/actions&userName=mqtt&password=mqtt&host=tcp://" + mqttHost)
   .unmarshal().serialization()
   .bean("droolsMicroservice", "processAction");

The source code is managed by Maven. This allows us to easily add the SilverWare secret sauce to bring all the components to life. This is done by adding a few dependencies to our project's pom.xml (in addition to the components we already use in the project):
<dependency>
    <groupId>io.silverware</groupId>
    <artifactId>microservices</artifactId>
</dependency>
<dependency>
    <groupId>io.silverware</groupId>
    <artifactId>cdi-microservice-provider</artifactId>
</dependency>
<dependency>
    <groupId>io.silverware</groupId>
    <artifactId>camel-microservice-provider</artifactId>
</dependency>
<dependency>
    <groupId>io.silverware</groupId>
    <artifactId>camel-cdi-integration</artifactId>
</dependency>
<dependency>
    <groupId>io.silverware</groupId>
    <artifactId>drools-microservice-provider</artifactId>
</dependency>

This automatically takes care of all the resources known to SilverWare and its providers. In this case, all CDI beans annotated with @Microservice, all Camel routes and all Knowledge JAR files are discovered, made available to the other components and started. As a result we get an executable JAR file with all the libraries in a separate lib directory; the overall size of the deployment is roughly 30 MB.

It is now possible to deploy the application to OpenShift v3 and manage it there. For this purpose we created a template that prepares everything needed for a successful deployment to OpenShift v3. One of the core and very interesting components is S2I (Source To Image), a tool for building "ready to run" Docker images from sources. Given its complexity, we will follow up with another article describing how to deploy SilverWare Microservices to OpenShift v3.

In this demonstration we showed that we can easily develop simple microservices using the SilverWare framework, invoke BRMS Knowledge sessions from them, and link it all together with Camel and JBoss A-MQ, all running in OpenShift v3 and deployed as Docker containers. Nothing more was needed than writing a few microservices and Camel routes - the components that provide the real business value - with no boilerplate code. We did not need to learn any new principles or tools.

Diving into Visual Studio 2015 (Day #1): Code Assistance

2016-05-10 03:15:05

In this series, I’ll cover how development with Visual Studio 2015 can increase your productivity and enable you to write cleaner and more optimized code.

Introduction

I have always been a great admirer of the Visual Studio IDE (Integrated Development Environment). Visual Studio has proved to be the best IDE for me, and I use it for almost all my coding as well as debugging work. My love for the IDE has inspired me to start a series of articles explaining what Visual Studio 2015 now offers a developer in terms of cross-platform development, cloud-based development, code assistance, refactoring, debugging, and more. The power of Visual Studio is not limited to development and coding: it offers a one-stop solution to everything needed during coding, development, code analysis, or deployment. I’ll use Visual Studio Enterprise 2015 throughout the series and explain how one can leverage Visual Studio to be more productive. In this part of the series, I’ll cover how development with Visual Studio 2015 can increase your productivity and enable you to write cleaner and more optimized code.

Code Assistance

In earlier versions of Visual Studio, you may have seen that whenever you write buggy code, the code editor provides suggestions with the help of a tool tip. This feature has improved a lot and is now surfaced as a light bulb icon in the Visual Studio code editor. It provides real-time suggestions while coding to improve code quality or fix coding issues: it helps you identify syntax errors, provides useful code hints and assists you with static code analysis.
I am using a sample code to explain the enhancements; for that I have created a console application in Visual Studio and named it VisualStudio2015ConsoleApplication.

Syntax Error Suggestions

Suppose there is a syntax error in your code, as I purposely introduced in the image below. The light bulb icon shows up as soon as you click on the erroneous variable marked with a red squiggle, and it displays an issue summary, an error code with a link to the documentation, and a list of possible code fixes and refactorings. In the example, I am writing an Add method taking two parameters a and b, but I am trying to return a + bh. Since "bh" is not declared anywhere in the method or passed as a parameter, the light bulb icon shows up and offers suggestions for how this variable can be taken care of: it suggests generating a variable named "bh", or creating a field or property.

If you hover over the error line, you’ll be shown a light bulb icon with the error and potential fixes. Note that you can also press Ctrl+. to see the error using your keyboard. If you click on Show potential fixes, you’ll get the same options as shown in the first image. Alternatively, if you doubt the light bulb icon and build your console application, you’ll be shown the same error again.

The syntax error assistance displays a light bulb icon, a description of the error, and a link to show the possible fixes. When you click on the error code, i.e. CS0103, you’ll be redirected to the documentation for that error code. It also offers to preview the changes for any of the suggestions provided by the light bulb icon. So if I click on Preview changes, it shows me a preview of the option I have chosen. Now we don’t have to go into the code and explicitly define the variable: just click the Apply button and Visual Studio takes care of everything.
The first option that I chose is now reflected in my code. I remember doing these on-the-fly modifications with ReSharper to improve productivity. We saw that we got the same error on compiling the application, which proves that the light bulb icon’s code suggestions can help us write error-free code before even compiling, rather than learning about the actual error after a compile. In other words, we don’t have to wait for a compile to find out about compile-time errors. You can test different scenarios in your day-to-day programming to explore the syntax error suggestions given by the light bulb icon.

Code Suggestions

Let’s take another scenario. Suppose I define an interface named ICalculator, add a class named Calculator, and inherit the Calculator class from that interface.

Interface:

interface ICalculator
{
    int Add(int a, int b);
    int Subtract(int a, int b);
    int Multiply(int a, int b);
    float Divide(float a, float b);
}

Class:

public class Calculator : ICalculator
{
}

You’ll see a red error line under the ICalculator interface name in the Calculator class. You can bring up the light bulb icon in the same way as in the previous example, i.e. hover over or click on the error. Here the light bulb icon assists us with additional conceptual information: the interface we are using contains several methods that need to be implemented in the Calculator class. So the light bulb not only assists us in finding syntax errors but also suggests conceptual or logical resolutions of the mistakes we make in our programming.
When you click on the "Show potential fixes" link, it shows all the possible fixes for this error in a detailed, user-friendly manner, with an option to resolve the issue and preview the fix. In the image above, you can see that the code assistance offers to implement the interface ICalculator either implicitly or explicitly, and if we analyse the situation we can say that these are the only options a developer would choose in this scenario. It also shows the error code link referring to the error’s description. If you choose the first option and click the Preview changes link, you’ll see a preview that you can apply if it is what you need. So click Apply, and we get the following class with all the interface methods having default implementations:

public class Calculator : ICalculator
{
    public int Add(int a, int b)
    {
        throw new NotImplementedException();
    }

    public float Divide(float a, float b)
    {
        throw new NotImplementedException();
    }

    public int Multiply(int a, int b)
    {
        throw new NotImplementedException();
    }

    public int Subtract(int a, int b)
    {
        throw new NotImplementedException();
    }
}

Likewise, the light bulb icon provides numerous code suggestions during development, and by following them we can write code more productively without unnecessary compiles or manual typing.

Refactoring Suggestions

The light bulb icon is not limited to code suggestions and syntax error suggestions; it also comes with a great set of refactoring capabilities. When I added the Calculator class, it was created with a few default namespaces, as shown below. In this scenario there are a few namespaces added by default that are currently not used.
When you hover the mouse over those namespaces, the light bulb icon shows up with refactoring suggestions. The image shows the light bulb icon suggesting that we remove the unnecessary usings. Visual Studio is smart enough to know what refactoring is required in the code and accordingly suggests how the developer can optimize it. If you apply the suggestion, it removes the unnecessary usings from your code. You can also choose to fix all occurrences of this issue in the current document, the project, or the solution. If we just want to make the local change, we select Remove Unnecessary Usings, and the unused usings are removed.

Quick Suggestions and Refactoring

Now I go to my Calculator.cs class and define the Add method as follows:

public int Add(int a, int b)
{
    int c = a + b;
    return c;
}

This is a correct way of defining an Add method, but on second thought, what if I want to refactor or optimize it? Visual Studio 2015 lets us do quick refactoring of code, with suggestions, through an option in the editor’s context menu. Just right-click on "c" and you’ll see an option at the top of the context menu saying "Quick Actions and Refactorings…". Note that for the code block above Visual Studio did not suggest anything on its own, nor is there any syntax error, but we can explicitly ask for Visual Studio’s help to find out whether a particular piece of code could be enhanced, optimized, or refactored further. In some cases this option shows no suggestions at all, which means your code is already refactored and optimized.
But in the scenario above, if we select the "Quick Actions and Refactorings…" option, Visual Studio gives us two options to further optimize the code. Looking at both, the first says to skip the temporary variable and just return a + b (a good suggestion, by the way), and the second offers to extract a method and return a + b from there. In these situations, it is the developer’s choice which option to take. I chose the first option, applied the changes shown in the preview, and got the following code, which looks better than the earlier version:

public int Add(int a, int b)
{
    return a + b;
}

Note that these are small examples chosen just to illustrate the power of Visual Studio 2015. There can be complex and tricky scenarios where you may actually need a lot of help from these features.

Conclusion

I took a few representative examples from the large set of scenarios where the code assistance of Visual Studio 2015 can help you. You can explore the IDE and play around to find more situations where you feel the change from earlier versions of Visual Studio. In the next part, I’ll talk about Live Static Code Analysis. For more technical articles you can visit my personal blog, CodeTeddy.

References
