Showing posts with label model.

Monday, July 7, 2014

How Change Events Are Triggered In Backbone

I was having trouble understanding the various states that a given Backbone model, and all its attributes, might be in. Specifically, how does an attribute change and trigger the change event for that attribute? The change event isn't triggered when the model is first created, which makes sense because the model had no previous state. In other words, creating the model also creates a state machine for the model attributes. However, this can be confusing if you're adding a new attribute to a model.
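A minimal sketch of that behaviour (the Book model and its attributes are made up for illustration; it assumes Backbone and its dependencies are loaded):

```javascript
var Book = Backbone.Model.extend({});

// No 'change' event fires here -- construction establishes the initial state.
var book = new Book({ title: 'Dune' });

book.on('change:title', function (model, title) {
  console.log('title is now: ' + title);
});

book.set({ title: 'Dune Messiah' }); // fires 'change:title' and 'change'
book.set({ title: 'Dune Messiah' }); // same value, so nothing fires

// Adding a brand new attribute still counts as a change.
book.set({ year: 1969 }); // fires 'change:year' and 'change'
```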

Wednesday, July 2, 2014

Backbone: How To Define Model Properties

Backbone models are kind of like plain JavaScript objects in that they have attribute names mapped to attribute values. In fact, you generally instantiate models by passing in a plain JavaScript object to the constructor. The difference, however, is in how you read and write Backbone model attributes. There's the set() and the get() method — an indirection not exhibited by plain objects.
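A quick sketch of the difference (the attribute names here are made up):

```javascript
var user = new Backbone.Model({ name: 'Alice', age: 30 });

// Plain-object access doesn't work; attributes live in user.attributes.
console.log(user.name);        // undefined

// The Backbone accessors:
console.log(user.get('name')); // 'Alice'
user.set('age', 31);           // routed through set(), so change events fire
```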

Thursday, March 27, 2014

Parsing Dates in Backbone Models

The parse() method on Backbone.Model is useful for parsing raw response data into model properties. These are typically things like dates — the API doesn't return a JavaScript Date instance — it's either a number or a string representation of a date. It's not terribly useful to store these strings or numbers in the date properties of models. Using parse(), you can add a level of certainty in your application that the date fields on your models will always be Date instances, and not numbers or strings.
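A minimal sketch, assuming the API returns a created field as an ISO 8601 string (the model and field names are made up):

```javascript
var Article = Backbone.Model.extend({
  // parse() runs on fetch() and save() responses, and on construction
  // when the { parse: true } option is passed.
  parse: function (response) {
    if (response.created) {
      response.created = new Date(response.created);
    }
    return response;
  }
});

var article = new Article(
  { id: 1, created: '2014-03-27T12:00:00Z' },
  { parse: true }
);
console.log(article.get('created') instanceof Date); // true
```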

Friday, February 28, 2014

Fetching Backbone Models The Safe Way

If you're using Backbone, odds are that you have several detail views for various types of objects. Before those views are rendered, you need to ensure the model has been fetched. Otherwise, you don't have anything to render besides the template, perhaps with placeholders filling in for the real model data. Or maybe your application has already performed a fetch on the collection, in which case we have all the data required to render the detail view of the model. But unless you're keeping that collection synchronized, you could be rendering stale data.
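One way to play it safe is to re-fetch the model even when a copy already sits in a collection, and only render on sync. A rough sketch, with the User model, users collection, and template all made up for illustration:

```javascript
var UserDetail = Backbone.View.extend({
  initialize: function () {
    // Render whenever the model syncs with the server.
    this.listenTo(this.model, 'sync', this.render);
  },
  render: function () {
    this.$el.html(_.template($('#user-detail').html())(this.model.toJSON()));
    return this;
  }
});

// Reuse the collection's copy if there is one, but fetch regardless,
// so the detail view never renders stale attributes.
var user = users.get(userId) || new User({ id: userId });
var view = new UserDetail({ model: user });
user.fetch();
```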

Monday, April 18, 2011

Pick Your Cloud Model

Cloud computing is a category your application is either a part of, or it isn't. We might even go so far as to call it a paradigm, a way to classify software – your code is either object-oriented or something else, like functional. As a model of computing, cloud means a lot of things; it's too broad to describe anything in particular. Does your car fall into the category of “vehicle”? Yes, but that doesn't tell us much about the car. You can't exactly tell someone you travel by means of “vehicle” without getting a laugh or two. You can, however, say something like “I drive a four-wheel drive truck”. This is something the other party can relate to, sparing the technical jargon. Perhaps there aren't any further-refined sub-genres of the cloud just yet. Maybe a good starting point is looking at what we're hoping to accomplish with our software. Maybe then we can better identify auxiliary cloud models, and when it makes sense to use one over the other.

The term cloud computing is notoriously ambiguous, and with good reason: I don't think there is an easy way to define it, as it means different things to different people. Software systems that utilize cloud technology each have their own vision of what the cloud is exactly. Rather than trying to craft an all-encompassing definition of what cloud means, perhaps the better approach is to identify some of its well-defined components and piece them together as something that is a “kind of” cloud. Why do we need to stuff everything we do over the Internet into a single pigeonhole? I say leave it vague – the cloud means events that take place over network-connected nodes. Let's focus on the more tangible models of cloud computing.

The cloud's most compelling promise is budgetary - saving money. We can use a cloud provider's resources for less than it would cost us to buy, set up, and maintain our own. There are other, technological benefits to cloud computing too. Providers give us APIs we use to store and retrieve our data, to control our virtual machines, or to let someone else do it for us. How does this technology coalesce with your business goals? Do cloud offerings somehow solve deficiencies in your software?

One universal problem throughout all information technology is a lack of physical hardware resources. The goal of any business is to scale up their operations, meaning there's a large demand for whatever it is their software does. Say you're operating an online store and you've got a lot of customers. The software you started with isn't going to fulfil the demand of your newly acquired customer base. You've now got a scaling problem, which is, in the most optimistic sense, a good problem to have. The need to scale up equals a lack of compute resources – CPU time, memory, storage, and bandwidth.

The virtual cloud model can help with provisioning more physical hardware resources as they're needed. With this model, you deploy your existing application, without modification, to a cloud provider. There is no need to hack your software for the sake of portability. As long as you can get it running as a virtual machine, you can deploy it to the cloud. When your software is virtual, it's no longer at the mercy of the hardware. When your hardware fails, and it's always a safe assumption that it will, you've got other copies of your application – new copies can be cloned as required. Of course, I'm over-generalizing a little when I say you deploy using the virtual model with no modification. There is no such thing as software that knows how to adapt to a new environment and perform optimally without error.

Having said that, as a consequence of virtual machines being so easy to deploy, so readily available, so cheap - they're replaceable components in this model. You can find yourself a virtual machine that plugs into your infrastructure. This is probably something specialized, something that serves a particular purpose, like a database server or a cache node. The virtual cloud model promotes reuse and interchangeability.

What's different about the virtual cloud model from anything we've done in the past is running your entire infrastructure on a service provider's hardware. Not just a few services, but the whole thing. This can significantly lower your operating expenses, absolutely. If you don't have any hardware to operate, you've essentially eliminated that cost. The virtual model of cloud computing isn't for everyone, because not everyone is capable of migrating their existing software to an entirely new platform. Even armed with the know-how, would you really want to do it? In addition to the risk of carrying out such an endeavor, you've still got to go through the process of migration, even though you don't need to write new code.

The alternative cloud model, as I see it, is that of the service model, the service-oriented architecture if you will. The service model is fine-grained while the virtual model is coarse-grained. Services offer small bits and pieces of data and functionality, whereas the fundamental unit of a virtual environment is an operating system.

There are all kinds of services available on the web. The web itself, that is, individual web sites offering content, is a prime example of a service. If you subscribe to the opinion that the cloud is nothing more than a synonym for the Internet, you're probably a service-minded person already. The service approach solves the same basic problems as the virtual approach – finding ways to cut computing costs and provisioning more resources at a moment's notice. Instead of offering virtual machines, the provider offers a service, another type of resource analogous to a virtual machine.

Storage is the most common service you'll find on cloud providers. Your application makes an API call to store a chunk of data. It makes another call to retrieve that chunk later on. An interesting usage scenario is using cloud storage APIs as your secondary storage, as a replicated copy of your primary data hosted elsewhere. The service cloud model isn't just about storage, though; it's about providing an API for anything, any computation that you don't want to perform locally. Maybe you can't perform it locally because you cannot keep up with the demand. Or maybe you simply do not have the required software to do it. Imagine a charting API that produces chart images based on supplied input parameters. The Google chart API does just that. All the client cares about is producing the necessary input data for the chart. Generating the chart image is up to the API, my input parameters processed somewhere in the cloud.
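As a rough illustration of the store/retrieve pattern, here's a sketch against an entirely hypothetical storage endpoint (not any particular provider's API):

```javascript
var STORAGE = 'https://storage.example.com/my-bucket/';

// Store a chunk of data under a key with one API call.
function storeChunk(key, data) {
  return fetch(STORAGE + key, { method: 'PUT', body: JSON.stringify(data) });
}

// Retrieve the chunk later with another call.
function retrieveChunk(key) {
  return fetch(STORAGE + key).then(function (res) { return res.json(); });
}

storeChunk('orders-backup', { total: 42 })
  .then(function () { return retrieveChunk('orders-backup'); })
  .then(function (data) { console.log(data.total); }); // 42
```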

What are the differences between the two cloud models? How do I know which one to use? Do the two models have anything in common with one another? The key difference between a virtual cloud model and a service cloud model is the application itself. Take a photo printing store, for instance. This store allows users to upload their digital photos to be printed and mailed back to them. This business's website needs several different components to operate. It needs the website itself, which includes things like the user interface and static pages that display information about the company, the services offered, and so on. The facility to upload photos is a component on its own, the service for processing payments is also a separate component, along with the facility to queue the physical photos for printing.

Let's now imagine we're going to deploy our photo store using the virtual cloud model. Each component, the website, the upload facility, the payment gateway, and the print queue, is a virtual machine. Each component of our photo printing system is an operating system that can run in any virtual environment. If one goes down, it can be replaced with another copy of that same component. In our own local environment, we create the photo print queue and test it. This same utility gets deployed, without modification, to the cloud service provider. However, deploying without modification is a little misleading. Our components still need to orchestrate with one another, which isn't a trivial task unless I've found some system to facilitate the communication among the different photo printing components. That's unlikely, given that this is what makes our service unique and thus competitive. I've got to write something that does this.

The alternative method is to employ the cloud service model approach. Rather than deploying each of our application's components as a virtual machine, we deploy the photo printing website on our own host, calling different service APIs from the cloud to help our application along. Maybe we call a storage API to save uploaded photo files. Maybe there is a cloud API we can use to queue the photo print jobs, a message queue perhaps. There are a ton of payment gateway APIs that we can use.

So which model is better? There isn't a definitive measure of value when comparing the two models with one another, not without a context. The question instead should be something like: which model is better suited for what I'm trying to achieve? How does cloud technology help me when my own resources are running dry? You have to ask yourself, how equipped am I to operate a virtual model, and is it really just a matter of deploying my software without modification? Or is there a lot of mediation code I'm going to need? Maybe something like this already exists; how much will it cost?

If the virtual model doesn't help you, if you're not technically capable of operating such a setup, or maybe it's just not cost-effective, you can look at the service cloud model. The service cloud is all about providing APIs to applications, APIs that offer something of value to the application. Another factor favouring the service cloud model is deployment and availability. If my software utilizes one or more cloud APIs, they must have already existed before my application decided to use them. That is, after all, the value proposition of the provider offering the API. Consequently, the APIs should always be available, because this is the reason for the provider's existence: to offer applications such as ours compute resources on demand.

The cloud service model might also be lacking, depending on what you're trying to build. Consider our photo printing store. We've got some fairly well-defined components that will help it match customer demand. We've come to the conclusion, during the design of our system, that we need APIs that do certain things. Odds are we'll be able to find a payment gateway that will suit our application's needs just fine. But what about the printing queue component, the one where we send photos to be printed? This is a requirement specific to what we're trying to do with our own service. So we're going to have to build our own service to do something like this.

Can the two cloud models conspire to offer real value? My thought is that one can serve as a complement to the other. Take our photo printing pursuit, for example. We've examined what it would be like running in the cloud under both the virtual cloud model and the service cloud model. Both have their advantages – the virtual model gives a clear distinction between components and total control, as they're created by us. We can also find virtual components off the shelf that do what we want. The service model is lightweight, as in, we don't have to deploy changes to it because the services are hosted by the cloud. We can design our applications with these cloud service APIs in mind, giving us fine-grained control over how our software interacts with the cloud.

The two models can certainly help one another out. A solution for our printing service's job queue component might be to implement our own virtual service. That is, a combination of the service and virtual models, where a virtual machine implementing the job queue service we're looking for gets deployed, and we have total control over it. We're taking advantage of the control and flexibility of deploying virtual machines while keeping the design philosophy of our application's features, using an API-driven, service-oriented approach to certain tasks.

In the end, there are probably several more of these sub-genres of cloud we can concoct and describe once we've encountered real problems and devised real solutions to them. We simply use the higher-level, vague cloud term for inspiration.

Sunday, May 2, 2010

Modeling Existing Systems

Modeling software is an expensive activity. Building software models can be expensive for a multitude of reasons, the most obvious being the time lost on unsuccessful designs. Software models take time to put together, unless you are sketching, where the goal is to not spend a lot of time on them. Contrast this with a failed initial software implementation done with no up-front modeling: there is a high probability that reusable software components may be salvaged from the failure. Developers tend to become fixated on functional software. If something at least somewhat functional comes out of a disaster, it will still help with morale.

So, if detailed modeling has no place in up-front software development, does it have any place in software development? Creating a detailed model of an existing software system may prove valuable. The key to modeling existing software systems is to pinpoint exactly where the design has gone wrong. This is nearly impossible to spot during initial development, especially under time constraints, when design takes a back seat in favor of a functional system.

The obvious drawback to up-front modeling in software development is that it promotes a waterfall approach. The waterfall approach doesn't work as well as an iterative and incremental approach, if it works at all. This doesn't mean you shouldn't spend any time modeling; it means any up-front models created should be treated as informal at best. Perhaps it makes more sense to not even call them models, because that suggests a level of formality we want to stay away from initially. Simple diagrams treated as sketches are a good practice to follow early on.

Formal models of software systems created after the system in question has been deployed in a real environment can prove valuable. The reason for this is that you have a functional system that is relatively stable. After a system has been deployed for a while, hundreds, if not thousands, of smaller bugs have been eliminated. These are the types of bugs that a software model isn't likely to solve in a timely manner. With the smaller issues mostly removed, we are free to tackle larger design issues that aren't easily solved with code.

One of the first things you might want to model is the dependencies between the packages and modules that make up the system. I've tried this before and was amazed at the problems I was able to see before even trying to model inheritance between classes. When you have several dependency lines crossing one another, it just looks bad. This is often a reflection of a sub-optimal design. The poorly painted picture provides quite the motivation to fix these issues. The same goes for class hierarchies. The relationships among the elements in the system, not just inheritance, are worth modeling. The value of modeling the details of each element isn't as high. Encapsulation also applies to software models, to a certain degree.

Monday, March 15, 2010

ArgoUML 0.30

It looks like ArgoUML 0.30 is now available for download. I look forward to testing it out. This is another step toward UML 2 compliance with open source software modeling tools.

Wednesday, March 18, 2009

3D data and 3D UML

An interesting and experimental idea: the ability to design three-dimensional models using UML. This idea is shared among some visionary coders who have already demonstrated the ability to model UML in a 3D space. Diagram elements can be layered, placed, and rotated in a three-dimensional manner. Glasshouse is another cutting-edge GUI for visualizing relational data sets. Users can use SQL or spreadsheet data as input, and Glasshouse will present the user as an avatar within a 3D environment. The avatar then acts as the user within the data environment, allowing the user to manipulate the data in an interactive way never seen before. The remaining question is, what is wrong with the current standard 2D visualization of data and UML models today?

The answer is that there is nothing inherently wrong with viewing data in two dimensions. The same holds true for UML models. Before the graphical user interface, common on most desktops today, there was the command line. There is also nothing inherently wrong with the command line. However, the GUI was invented for a reason: so that human users can quickly comprehend what is displayed in front of them. Trying to grasp a relational data set displayed in the console is possible, although it would most likely take a seasoned professional two weeks to understand it fully. If that same data set is presented graphically, many more features become available, such as moving windows around. With a GUI, it might take that same professional a day or less to fully understand the data. Now, imagine trying to display, edit, and understand a modest UML diagram in the console. For humans understanding data, the GUI was a big first step, and understanding UML models followed shortly after. The next step is another dimension.

The Glasshouse project is a good example of the direction this first 3D data manipulation interface might take. Insights about data sets that were never possible before will most likely become possible. Users have much more freedom in the perspective from which they view the data. Will the UML be able to follow this direction? Some tools have already started by making individual diagrams rotatable and stackable within the modeling space. This is where the 3D functionality in the UML ends. There is currently no tool that offers 3D UML elements such as classes, objects, or interactions. Could avatars be used as actors in use cases? For instance, when simulating a use case realization, the avatar (the actor) could actually move about in the collaboration among the other 3D UML elements. An instance of some class could expand and contract in three dimensions according to how many resources it is using.

These are some incredibly complex design challenges to implement. Building a two-dimensional UML modeling tool is by no means trivial. Building three-dimensional interfaces is also not trivial. Combining the two could take decades just to get a functional demo working. Is something like this worth the effort? Would this new UML 3D modeling interface produce better software, faster? In the end, this amounts to a tough decision because of the risk involved. But that hasn't stopped other ingenious software projects from being built in the past.