Monday, August 17, 2015

My TODO List with MongoDB, Express, Angular and Node

MEAN Is as MEAN Does!

The motivation for this post is to continue my training in the MEAN stack of technologies.  Recently I had the pleasure of meeting and sharing some learning time at the Vancouver MongoDB User Group (MUG).  The meeting was sponsored by Light House Labs; they donated the space and the much-appreciated, classic meet-up staple: pizza and "healthy drinks" (some sort of zero-calorie patented combination of salts and flavors).

The meeting was, in one word, inspiring:  there is so much to learn!  So much to share!  That is how we landed on the moon!  Talking, sharing...  oh, man!  From there, I said to myself: "OK, OK, I am going to do the new 'Hello World', the 'One Ring to rule them all': a TODO list using REST and MEAN."

TODO: Mongoose Schema

The Todo Schema has three properties: title as String, completed as Boolean and createdOn as Date.  Note that createdOn has a default value of Date.now.  Also note that the title is required and that it has custom validation.



The idea behind createdOn is to eventually allow only one Todo of a given type per day.  I have breakfast only once a day; the second time one eats breakfast food it is not breakfast anymore but something else...  lunch perhaps?  I love breakfast food for dinner!!!

In the future I am going to play with OAuth and an authentication middleware so different users can have their own Todos.

TODO: Node REST API

What is new here?  Nothing, but it is mine!  It is worth mentioning that this router handles all the REST calls.




TODO:  Server

Here is the server.  The server uses the router middleware.  It is important to note that the router needs to be set up before the 'catch all' route app.get("*", function(req, res){ ... });.  The idea is that all other calls are handled with the index page, while the REST AJAX calls made by Angular are handled by the router middleware.

It is interesting to note that the connection to MongoDB does not state a port number.  I assume that Mongoose tries the default port 27017 first, and if the service is running on that port then it uses it and everything is hunky-dory.





A section of code in the server is commented out.  I was not able to successfully use the "flash" middleware with the "jade" view engine to display error messages.  So for now, when there is an error, I send a JSON message to the client and let the Angular client display it, perhaps in a popup using the Angular modal window service...  it would be OK, I guess.

TODO Controller

The controller is pretty fat, I must admit; however, it does everything I need it to do.  Note that the Mongoose Todo has been injected; you probably saw that in the REST router.  Things to note in the controller:

  • Post:  Uses find(query, callback).  If no matching Todo is found then Todo.save(callback) is invoked.
  • Get: Gets all Todos using Todo.find({}, callback).  Note that the query is empty; passing the empty query is optional.
  • Update:  This is used with REST PUT.  Obviously the item is updated only if found, so this function uses Todo.findById(id, callback) and the Todo is saved in the callback.  This update happens when the user toggles the "done" checkbox.  I decided to keep the done Todos until purged.
  • Purge:  Purge is a bit more interesting, so let us take it aside.
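The Post/Get/Update parts might look roughly like the sketch below, written as a factory with the Todo model injected so it can be exercised with a stub; handler names, status codes and response shapes are my assumptions:

```javascript
// Sketch of the controller with the Mongoose Todo model injected.
function todoController(Todo) {
  return {
    // POST: only save when no todo with the same title exists yet.
    create: function (req, res) {
      Todo.find({ title: req.body.title }, function (err, found) {
        if (err) { return res.status(500).json({ error: err.message }); }
        if (found.length > 0) { return res.status(409).json({ error: 'duplicate' }); }
        new Todo(req.body).save(function (err, saved) {
          if (err) { return res.status(500).json({ error: err.message }); }
          res.json(saved);
        });
      });
    },
    // GET: the empty query {} returns every todo.
    getAll: function (req, res) {
      Todo.find({}, function (err, todos) {
        if (err) { return res.status(500).json({ error: err.message }); }
        res.json(todos);
      });
    },
    // PUT: update "completed" only if the todo is found.
    update: function (req, res) {
      Todo.findById(req.params.id, function (err, todo) {
        if (err || !todo) { return res.status(404).json({ error: 'not found' }); }
        todo.completed = req.body.completed;
        todo.save(function (err, saved) { res.json(saved); });
      });
    }
  };
}
```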


todoController.purge

The Angular client's $http service invokes the purge API; then, in the Node controller, the Todos collection is queried for {completed: true}.  If the query returns something, we call Todo.remove on all of them, BUT note that this is done with only one call to MongoDB, using a query with the $in operator.

The $in operator selects the documents where the value of a field equals any value in the specified array; here, the array holds the ids of the Todos with completed === true, and we call remove on them.
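A sketch of the purge flow, with the model injected so the logic stands on its own; the function name is mine:

```javascript
// One query for the completed todos, then ONE remove call using $in over their ids.
function purgeCompleted(Todo, done) {
  Todo.find({ completed: true }, function (err, completed) {
    if (err) { return done(err); }
    if (completed.length === 0) { return done(null, 0); }
    var ids = completed.map(function (todo) { return todo._id; });
    // Single round trip: remove every document whose _id is in the array.
    Todo.remove({ _id: { $in: ids } }, function (err) {
      done(err, ids.length);
    });
  });
}
```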

This is very cool.

So, what else can I show you?  Hum...  ah!  Ok, the Angular app.

TODO:  Angular Factory and Angular Controller

Nothing to it, I like how trimmed it looks.  I wish I was as trimmed and strong as one of my factories!  I would look like Terminator, the good one, the one that John Connor sent to protect his mother and himself when he was a kid in T2.  Yeah!  I like that!
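A factory along these lines, sketched as a plain function (the URLs and method names are my assumptions; in the app it would be registered with app.factory('todoFactory', todoFactory)):

```javascript
// A thin wrapper around Angular's $http service.
function todoFactory($http) {
  return {
    getAll: function () { return $http.get('/api/todos'); },
    create: function (todo) { return $http.post('/api/todos', todo); },
    update: function (todo) { return $http.put('/api/todos/' + todo._id, todo); },
    purge:  function () { return $http.delete('/api/todos'); }
  };
}
```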


and the controller is this one:
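A hedged sketch of the controller, again as a plain function; scope member names are my guesses, and the point is that it only talks to the factory:

```javascript
// The controller delegates every server call to the injected factory,
// which keeps it almost as trim as the factory itself.
function TodoController($scope, todoFactory) {
  $scope.todos = [];

  $scope.load = function () {
    todoFactory.getAll().then(function (response) {
      $scope.todos = response.data;
    });
  };

  $scope.toggle = function (todo) {
    todo.completed = !todo.completed;
    todoFactory.update(todo); // PUT the new "completed" state
  };

  $scope.load();
}
```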


The Jade View

If you want to put it all together, here is the jade view.  It is a little wonky; I have to make it work a little better with errors...  I do not like what I have there.


I notice right now that I have to move the JS code reference to the end of the file...  hey, would you do that for me, pal?

:)

My MEAN Dev Tool

It seems like months since I first downloaded and installed Visual Studio Code for Node.js-related development, but this journey started just a few days ago, on July 30th!  I guess we get used to good things very fast.  Since then I have uninstalled all my other code editors:  Visual Studio Code is here to stay on my computer and that is that...  Sure, it does not have the super advanced features that other products bring, but heck, it is free and it works!



Fellows, I tell you: it has IntelliSense, you can debug your Node.js code, place breakpoints.  I hope that Microsoft continues enhancing this product!




What is Next?

I could make my view better, perhaps prettier.  I want to plug in my Mocha tests and add Passport so other team members can have their own Todos.

Ah!  In the case that MUG lets me, I could use this to talk about things at the user group.

Get MEAN TODO From GitHub

You can always clone the project from GitHub; here:



References:

  1. First time I use Visual Studio Code.  This note contains a reference to where to get the tool
  2. Noding since...  not that long ago:  NodeSchool gave me the bug!
  3. Using $in:  MongoDB $in operator

Sunday, August 2, 2015

Mongoose Schema Validation

Bad Data Is No Better Than No Data:  Then, Validate Ebenezer!

After my previous post, Noding, MongoDBing With Mongoose: On the Account of the MVA and Visual Studio Code, the natural thing to do was to validate the data model before a save or an update.  Let us get to it then!

It is always good practice to validate application data in the logic layer, but it should be a hard requirement for any solution to enforce data validation at the model layer just before the data is saved.  MongoDB does not provide data validation, therefore data validation is on us; we are responsible for it.

Using the same Mongoose data model defined in the previous note, let us see how many things can go wrong with our data before adding validation.  So, we had BankData with a collection of Accounts.  Can BankData be saved or updated without a FirstName or LastName?  Can a new Account be added without a type?  Or perhaps a negative amount added to an account?  Oh, yes, it can be done!  The Mongoose Schema has no validation!  That's not good; darn bank!


The client was able to save BankData with no first_name and no last_name and later on it was updated with a new account with no type.  

I get it, I need validation, so what types of validation does Mongoose offer?  First off, validation is defined in the schema and takes place when a document attempts to be saved, after default values have been applied.  Therefore, if our schema had contained validation, then the above saves, without changing the code at all, would never have worked and the error would have been passed to the callback, which means that assert.equal(null, err) would have failed.

Built-in SchemaType Validation

Luckily for me, built-in validation in the SchemaType can be applied to make these fields required.



If the client tries to save the bad data then Mongoose, before the save, executes the validation, detects the error, and the data is not saved.  Nice!  But there is more: the String type takes a list of allowed values, therefore I could place here the currencies allowed by the bank, say USD and CAD only.



Custom Schema Validation

But what if this bank accepts all currencies of the world, except...  the British Pound!  Then the above validation would be unreasonable, because the enum array for currency would be really, really long.  Ah!  Instead we could use a custom validation and test only for the invalid ones!  How does that work?  Easy!



The custom validation above passes a validation function and an error type to the model SchemaType.  The currency could be anything but GBP.  This is a bit of a problem, because now anything goes but the little £; even made-up currencies could be set, which is truly wrong.

Mongoose Middleware

Using Mongoose middleware we can validate the account currency type by intercepting the save process in a pre-save event and testing the currency set by the client against a collection of currencies previously stored in MongoDB.  A middleware is like an interceptor which inserts its execution before or after another process.  The pre-save would be like catching the save on the fly.  This is useful for executing complex validation or for setting some properties of the object just before the save.


A Pre-save middleware executes before the save operation and if everything is OK what happens next is the actual save.


The way to implement a pre-save middleware for the account is like this:





A few things to notice from the above middleware.  First, there is a new Mongoose schema, the validCurrencyModel.  This model represents a valid currency used by the bank.  The bank's collection of valid currencies might not change very often, but I decided to store it in the database for better management.  I'll show you how I batch-load it into MongoDB a little later.

Note that the middleware executes a query for the currency of the account (this.currency) on Currency to find out whether the currency set by the client is valid.  If the currency is not found, then an error of type InvalidCurrency is issued, and the next thing that happens is that the error is returned to the callback of the save.  If the currency type is valid, then the middleware calls next(), which means "carry on to the next step, the save; this data is good".

Batch-Loading Currency

A Currency object was created.  Note three things in the following code section:  currency_symbol is required, has custom validation and is unique.



Note that the batch load was done by creating an array of Currency documents and passing it to create.  Create returns to the callback the error and the entire collection of created documents.  Simple.



Conclusion

In this note I have used Mongoose validation in three different ways to validate the Data Model:

  • Validation at the SchemaType level:  we can use required, enums of allowed values for Strings, and max and min for Number types.  In the case of Currency, the SchemaType for currency_symbol was also set to "unique: true" in order to avoid duplicates.
  • Custom validation:  by passing a validation function to the SchemaType.
  • Mongoose middleware:  validation of the model in a pre-save event.
This note shows only a taste of Mongoose validation.  You have probably noticed that to test my Mongoose model I have been using scripts, which is not very practical.  I feel that the natural continuation of this Node + MongoDB + Mongoose work would be to use the defined model in a system through a RESTful API using Express...  hum, I would not be surprised if I soon take this on.


Thursday, July 30, 2015

Noding, MongoDBing With Mongoose: On the Account of the MVA and Visual Studio Code

Noding From 0-60 with Stacey Mulcahy, Rami Sayar and MVA

Today, July 29th, Microsoft Virtual Academy had in store a great course, delivered masterfully by Stacey and Rami.  It was a 7-hour event!  During those seven hours these fellows talked about Node.js, Express, databases with MongoDB, debugging and deploying Node.js, and extending Node.js with Azure and Web Jobs; wow!  This course has all it takes to become popular at MVA!

During the first hour Stacey and Rami went through the obligatory introduction to Node.js, laying out good arguments for event-driven programming, asynchronous vs. synchronous (blocking), the classic "Hello Node", etc.  They did all this, and much more, using Visual Studio Code, a very sweet editing tool which I think is going to give The Other Big Ones (Sublime, WebStorm, Notepad++, etc.) a good run for their money!  Yeah, Visual Studio Code is as light as pancakes and ready to give you a sugar high!  Go get it, boys and girls!


Visual Studio Code.  Note the Git icon: it is telling me that the Git repository status is 5 new changes.  Sweet!


After the first hour of the course, still an introduction to many different Node-related subjects, the event picked up speed and intensity.  I do not intend to echo here anything that Stacey and Rami talked about, nor would I change one word, but I would like to add my two cents.


MongoDB Without Mongoose is Like Hot Chocolate without Marshmallows! 

I just think that they needed five to ten more minutes to really complete the material by adding object modeling with Mongoose to their MongoDB + Node example (chapter 13).  I feel that using Mongoose changes the nomenclature of the subject and completes it, although I understand that after 7 hours of talking, explaining and running demos, 5 more minutes would feel like an eternity!

Mongoose provides a simple schema based solution to help us model our application data and includes built-in types, query building, validation and much more just out of the box.  Built-in queries help us filter, and find data very easily.  Let me show you what I mean; but first you will need to install Mongoose.

To install Mongoose you need to run on the command line:

npm install mongoose --save

The --save flag modifies the package.json for Chapter 13 of the course and adds a new module to it: Mongoose.

Mongoose Schema:  Building the Model

With the help of Mongoose the following code defines two entities for our system:  Account and BankData.  I am using the same definitions that the course lays out and only providing object data modeling as an add-on.



Note that BankData has a collection of Accounts, encapsulated by the "accounts" array property on bankDataModel.  To add this Has-a collection of accounts to BankData, Mongoose tells us that we need to define an Account schema first; the BankData schema then describes the collection of accounts.

The last lines show how this mapping is done: Mongoose builds a model containing our object models, and finally the script exports that model so it can be required from the application.


 Mongoose it or Loose it

We are now ready to consume the model; the script requires it at the top.  Note that now this app does not deal with MongoDB directly at all; Mongoose is doing all the heavy lifting for saving, finding and querying.  The only thing we need to do is provide the URL for the connection to the database.  This assumes that MongoDB is installed and running on your system.
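A hedged sketch of the consuming script; the module path, connection URL and field names are my assumptions, and the CRUD flow is wrapped in a function taking the model so the logic is visible without a live database:

```javascript
// var mongoose = require('mongoose');
// var model = require('./bankModel');
// mongoose.connect('mongodb://localhost/bank');

function crudDemo(BankData, done) {
  var doc = new BankData({ first_name: 'Ada', last_name: 'Lovelace' });
  doc.save(function (err, saved) {            // Create: _id is set after the save
    BankData.find({}, function (err, all) {   // Read: the empty query returns everything
      BankData.findById(saved._id, function (err, found) {
        found.remove(function (err) {         // Delete the document we just found
          done(all.length);
        });
      });
    });
  });
}
```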




The above script just demonstrates how to use the model to Create, Read (find), Update and Delete (CRUD) using Mongoose.  You can compare this code to the code provided in the course to realize how sweet Mongoose is.  Some facilities provided:


  • model.save(callback):  saves the entity model to the document database.  save takes a callback; after the item has been saved, its _id is set.
  • model.find(callback):  finds all items.  find also takes a query argument, which Mongoose uses to filter, and a callback, taking the form model.find(query, callback).
  • model.findById(id, callback):  do I need to say more?
  • model.remove(callback):  you know what this does.

My two cents are that adding object modeling with Mongoose to MongoDB for Node facilitates our work, adds semantics and helps our code concentrate on our business cases.  And this concludes my little tribute to this magnificent MVA course.  This is not, by a long shot, all that Mongoose has to offer, and I would not be surprised if I find myself writing more about it in the future.



Sunday, July 26, 2015

Robots With NodeJS: Is there Anything JavaScript Cannot do?

Nodebot Challenge Complete!

This is what one would like to see after one starts the challenge.  Yeah!

Sweet Mama Screen!

Yesterday NodeSchool ran another global event.  It was an international NodeBots day, baby!  Take a look at where all the workshops took place: all over the world!  It makes you feel that you are really connected!  No language barriers; JavaScript and Node were the common language.

However, on my way to the challenge in Vancouver, I had car problems, frustrating!  I had no choice but to complete the NodeBot challenge at home after attending to my wreck.  I missed it this time!  Next time I'll plan to take public transit!

Ah man!  I do not know what is it with JavaScript:  so much enthusiasm, it is intoxicating.

Weak Module and Python 2.7.10:  Pre-Requisite Software for RPC Challenge (Remote Temperature)

The challenge mentions that the dnode module needs to be installed.  This is true, but not enough, and the moment you test your solution for the first time you will get an exception like this one:


No weak?  No Go!

The exception message is telling me that the "weak" module is missing.  If you run "npm install weak", that also breaks, because the "weak" module needs to be rebuilt on your computer, and for this we need Python.  You may have Python installed, and included in your path, but building weak only works with version 2.7.10.  So:

  1. Install Python 2.7.10 and include it in the path.
  2. Run "npm install weak".
Now you are ready to verify your solution for remote temperature, and if you got it all right then it will pass.  The interesting part of this challenge is the RPC (Remote Procedure Call) on a robot.  This is nuts!  Nice nuts, though :)  Read the documentation and look at the examples...  you can draw inspiration for your solution from there.

My Favorite Challenge?

Definitely the "Fire Alarm" challenge was my favorite one, just because it works just like the one on my wall!!!  Man, and it is JavaScript!!  I pondered posting my solutions, then thought that it would not help you get started.

Drop me a line if you need help and I'll be glad to exchange notes with you.

There Is No Limit To Our Imagination

Recently my family and I visited Vancouver Space Center.  Boy, is that the place for robots!  This morning at breakfast we were chatting about building robots to better ourselves and improve our lives.  Yeah, we can do that...

...I cannot wait until I get my hands on one of those Arduinos boards!

Wireless controlled robotic hand made with Arduino Lilypad.  This picture was taken from Arduino Blog


Wednesday, May 27, 2015

Node Your Business!

Node Baby, Yeahhh!

To think that about a week ago I was searching for ways to learn Node and how to get started!  Learning Node went from a curiosity, kind of like "yeah baby, JavaScript on the server is cool", to a necessity in front of the task of creating Lambda JavaScript microservices on AWS!  Yikes!  Too many unknowns in one paragraph drives me nuts!  True; and on May 22nd I found out that on May 23rd NodeSchool was holding an international workshop...  across the entire planet Earth, to teach anyone showing up the basics of NodeJS, and that there was only one seat left!!!   ...and I got it!!!




What were the prerequisites for this event?  None, not even knowledge of JavaScript.  They were teaching JS to whoever said they knew nothing.  Crazy or what!

The event was terrific, and it was held at the ultra-modern Zillow Vancouver Yaletown office.  They had tutors on hand; people helping people!  We had pizza and sugary drinks, but the most important thing was that enthusiasm was plentiful and we all knew that we were accompanied by hundreds, if not thousands, of other nut heads all over the world learning NodeJS.  Awesome!

Did I learn NodeJS?  Yes I did.  Here is one of my solutions to an exercise.  The problem was to asynchronously collect responses from three different URLs, which are passed to the Node script from the command line.  The script was supposed to collect the responses and print them to stdout in the same order the requests were sent to the servers.  The problem is that these servers will not respond in the same order.

There are different ways to implement a solution to this problem.  In fact, this is my third solution where I decided to use promises because they just felt appropriate for the scenario.
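The ordering trick can be sketched like this; the original used the `promise` npm package, while this sketch uses native Promises, and the function name is mine:

```javascript
// Fire all requests at once, but collect with Promise.all, which
// preserves the order of the input array regardless of completion order.
function collectInOrder(fetchers) {
  // fetchers: array of functions, each returning a promise of a string
  return Promise.all(fetchers.map(function (fetch) { return fetch(); }));
}

// In the real exercise each fetcher wraps http.get on one of the
// command-line URLs and resolves with the concatenated response body.
```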

Hey, drop me a line, do not be shy!


References

  1. NodeSchool:  http://nodeschool.io/
  2. Install NodeJS:   https://nodejs.org/download
  3. A simple implementation of Promises for Node:  https://www.npmjs.com/package/promise
  4. Install NodeSchool NodeJS Tutorial:  https://github.com/workshopper/learnyounode
  5. Download Git Bash:  https://git-scm.com/downloads
  6. Free AWS Account (12 Months):  https://aws.amazon.com/free/
  7. AWS Lambda:  This is not a Harry Potter spell, no.  It is a compute service which runs your code in response to events.  We only write the actual body of the event handler, then we tell AWS Lambda: hey, this is my handler and I am interested in this kind of event, which could be a database update, or messages arriving in AWS Kinesis (an AWS stream which is like a queue, but not really), or custom events from your applications.  The body of this handler function can be written using NodeJS.

Sunday, March 1, 2015

Azure Web Sites Demystified

British Columbians were today enjoying one of those delicious days, a perfect gift, with clear skies and temperatures around 10 C (50 F); undoubtedly nature is wrapping up winter and spring is starting to show its first signs.  I could tell Vancouverites were taking advantage of the climate because traffic was very light on my late-afternoon drive home.  I am glad that folks can enjoy these little bits of life, even when I was not:  I was at a Microsoft Azure Boot Camp sponsored by Microsoft Azure, BCIT, and the BC Dot Net User Group.  So, I told myself that "if I am up to facing this kind of sacrifice then I had better learn this stuff and get on using it"...  and...  you know, I think I did learn some!

In this note I want to document what I learned today and I would like to start with Azure Web Sites.  Yeah, that is a good start!

Creating Web Sites: Azure Site and Source Repository

For this note I am using Visual Studio and Team Foundation Server Online.  One thing that I learned the hard way today was that the relationship between the web site repository and the Azure Web Site must be one-to-one.  This makes perfect sense to me now, because when creating the site the tooling does not ask you for a specific project of your repository; it just asks for the repository name.

In order to create an Azure Web Site, a workflow that I found convenient was to condition a TFS Online account containing the source code of the site first, followed by creating the web site in Azure.

Conditioning a Repository in TFS Online

Creating a repository on TFS Online is very simple.  Login to your account and create the repository.



After you press the Create Account button you need to create a project.  Note that you have an opportunity to use Git or Team Foundation as version control.  I selected Team Foundation. 



Note that the account repository created for this project is "https://a1manblog.visualstudio.com/" and we are going to need this URL when linking the Azure Web Site to its source control.  After creating the account TFS Online offers the following dashboard.



If you have Visual Studio 2013 then you can click on the "Open in Visual Studio" link to open the project in Visual Studio.  Do not trouble yourself in the case that you do not have Visual Studio 2013 installed, because you can always install Visual Studio 2013 Community Edition, which is the same as the Professional Edition (wow!) with licensing limitations; click "Get Visual Studio" to learn more.

But assume that you do have VS 2013 and that you opened the project.

It is very convenient to map your project to a local directory at this time.

Creating the Project in Visual Studio

You can now create a project.  Visual Studio will ask you to choose to add this new project to Source Control and you need to select that option.  

Complete the project creation, build it and run it locally just to make sure it is all nice and dandy.  Once you are satisfied, check in the project.

A Drops Directory

Optionally you can create a "Drops" directory where successful builds are deployed.  I could not create the "Drops" directory directly on TFS Online (Visual Studio Online) and could only create this directory in my local copy of Visual Studio.  In Visual Studio > Source Control Explorer, right-click on the project and create a new directory named "Drops".

Create Drops directory and Check in

You can place builds here by editing the Build Definition file

The created Build Definition file is already set to Continuous Integration under "Triggers" by default when it is created using TFS online.

Azure Web Site

It is time to create the web site in your Azure dashboard.  Login into your account and select the "Web Sites" tab.  Note that Microsoft Azure is working on a new dashboard, however for this note I am using the current one:


Click on "New" on the lower left corner and select Custom Create.



In the Custom Create dialog window, provide a unique name for your site and check "Publish from source control".



In the next step, select the Visual Studio source control provider and advance in the wizard.  Now it is time to authorize the connection by providing the address of the project in TFS, clicking Authorize Now and accepting the authorization...  you are almost there!




After finishing the wizard, Azure creates the web site.  This takes a couple of minutes, and after a successful creation the Web Sites dashboard shows the newly created site.




Click on the name of the web site and the tools view now shows all the things that you can do to tune and configure your web site.  Note the "Configure" tab, which offers common features available through IIS Manager: be curious and explore, and you will recognize the functionality right away.

Go crazy and click around; how else would you learn what is there!  Explore!  You can scale your site; take a look at the features, which are self-explanatory and have help notes associated with them.




However, we are not done yet; to finish our deployment we need to do one more thing.  On the above screen, access the "Deployments" tab.  Azure now mentions a few things that you need to do when using TFS, like adding the project to source control and checking it in.  I really like how the Azure team provides "how-to" instructions to guide us all the way.  There is no way we can fail!


In the case that you have not linked your project, Azure provides easy steps.

Wait a minute!  We have done our check-in already, and continuous integration works with a check-in (check in, build, deploy).  In that case, I went back to Visual Studio, edited my project and checked it in.


Build in progress after a successful check-in.  Be patient, because it takes a while before you see this screen!


This is nuts!  TFS Online is now building and deploying my site!  I did not even have to create a build definition; it is all done for me!  The default one is great!

Keep in mind that if you are using the free option for TFS Online, you only have 60 build minutes available per month.  If we want to increase that then we have to reach for our wallets...  but it is not scary!

When the build and deployment is completed you see this screen:



Hey!  You are right; I have done a couple of check-ins, and with every check-in TFS built the project for me and deployed it to Azure as well!




What about that Drops Directory?

You caught me!  Well, yes, one does not really need this directory, because TFS and Azure are doing the heavy work of managing my builds and deployments; all default settings are pretty good and might be sufficient to get you going.  I can use this directory if I create/edit the current Build Definition file and indicate that I want to keep a copy of my builds in that location.  This is optional.


Deployment Alternatives

There are other ways to deploy a web site to Azure Web Sites.  The site can be created first, then added to Source Control (TFS Online), followed by deploying to Azure from Visual Studio.  This way is very easy as well, and you need to have your Azure login handy.


What is Next for Web Sites?

Oh man!  There is so much we can do, and at least this note got me started!  By inspecting the Web Site dashboard you can see all the other features that we have not touched in this note.  Some of them are:




I do thank Microsoft, BCIT, the BC Dot Net User Group for organizing the event and the speakers...  and my good buddies Nora, and Brian, and Medhat, and Jef, and Sergei.

Yeah, look at the time I am writing this, I am fired up and not even sleepy!


Conclusion

Getting started with Azure Cloud Web Sites is very simple and there is something for everyone.  I found that TFS Online (Visual Studio Online) integrates very well with Azure, and I had no problems using this integration.  I did find that it takes some time for the Azure Web Sites "Deployments" tab to get updated after a check-in has completed in Visual Studio.  This might create some confusion for a novice like myself, but once you discover that "issue" it is very easy to use the delay as an excuse to get yourself a cup of coffee after a successful check-in...  you deserve it anyway!

And there is more, much more!  Check out the references section of this post.  Another resource is the Microsoft Virtual Academy with very high quality videos and courses.

I will see you around learning Azure and godspeed in this journey!

References

  1. Microsoft Azure:  Register and you might be able to enjoy an introductory 3 months free.
  2. TFS Online:  Free TFS offered by Microsoft.  Visit the TFS online to learn more about this service.
  3. Web Site Scaling:  http://azure.microsoft.com/en-us/documentation/articles/web-sites-scale/
  4. Web Slots, Staging: http://azure.microsoft.com/en-us/documentation/articles/web-sites-create-web-jobs/
  5. Traffic Manager:  https://msdn.microsoft.com/en-us/library/azure/hh744833.aspx
  6. Hybrid Connections:  http://blogs.msdn.com/b/golive/archive/2014/11/21/azure-hybrid-connections-connect-to-protected-resources.aspx
  7. Redis Cache:  http://azure.microsoft.com/en-us/documentation/articles/cache-dotnet-how-to-use-azure-redis-cache/
  8. The World's Greatest Azure Demo, video by Troy Hunt; excellent!  https://www.youtube.com/watch?v=7V8HikBP1vQ
  9. Microsoft Virtual Academy Courses:  http://www.microsoftvirtualacademy.com/Studies/SearchResult.aspx?q=Azure%2bweb%2bsites
