INightmare's Blog

How to Survive the AI Revolution

AI is about to take a lot of jobs.

There are two categories of people:

  1. Those who think this is going to be a slow process, and that we will slowly adapt.
  2. Those who think this is going to be a fast process, and that we won’t.

Well… I’m in the second category.

The AI advancements over the past year are huge. In the software industry, AI is already delivering productivity gains of an order of magnitude. We are not very likely to see demand for software increase tenfold. That means staff reductions will follow once companies catch up to the tools and procedures.

Software is just the first to fall, as we have a lot of data and good feedback loops to train the models on. But AI is also good at law, writing, data analysis, and many other things. And as more subjects get the attention of model “trainers”, it will get good at those too.

White-collar jobs are going to shrink, and by a lot. So much, and so fast, that I believe this will trigger a slump in demand, which will drag other sectors down with it and trigger a recession. And this time, one that we might not even know how to grow out of.

So, what do we do?

I think for high-value economies like the US, the EU, Canada, South Korea, Australia, Japan, and others there is still some hope, at least in the short term: move to sectors that will (for now) be less affected.

For this, 3 things are needed:

  1. High, debt-fueled investment in onshoring and building out a blue-collar work base. Bring back full supply chains, from mines to refineries to manufacturing. There is still a lot that cannot be fully automated, and at the current level of automation, workers are actually going to be quite productive.

  2. Invest in reeducation and pay people to get educated (largely the case in the EU already); we will need the workforce to adapt to the industries we bring back with #1.

  3. Follow the US and protect yourself with tariffs against outside competition in those industries. This will reduce the requirements on #1. Yes, this is inflationary, but recessions are deflationary, so it will somewhat balance out. This is a tax on surviving, and so well worth it.

In the longer term

I’m not sure anyone has a clue. Most jobs are automatable. You can have the full supply chain automated: robotic miners loading driverless trucks that deliver material to unattended factories.

As we move toward that future, maybe Keynes’s idea of lowering working hours to match increased productivity can be another backstop. But the whole world needs to agree on this, or at the very least, the economic zones that do it need to wall themselves off from those that don’t.

Lift Framework

Today I want to write a little about the Lift Framework. It is a web application framework written in the Scala programming language, targeting Java web application containers. Scala is gaining popularity: according to various language popularity ratings (TIOBE and the Transparent Language Popularity Index), it is rising, and at the time of writing its popularity index is over 0.23% in both metrics. That is a good result considering how much attention programming languages get in general. So it seems natural that someone would write a web framework based on Scala, especially taking into account that Scala is often called a “static dynamic language” and is even compared to the likes of Ruby. The latter got much attention when the famous Ruby on Rails framework was released. Can it be that Lift will mean the same for Scala? It might. Twitter has moved from Ruby to Scala, Foursquare has ported its application to the Lift Framework, and the new Novell Vibe product was written entirely in Lift.

What do I like about Lift?

View-centric approach. The concept is somewhat similar to JavaServer Faces, where you have a view that references components and beans. That allows you to compose complex views and keep their logic separate (in MVC, your controller becomes a hub for all the services and data you pass to the view, so to add a new element to the view you need to update the controller to fetch the data and update the view to make use of it). In Lift, things are a bit different. The view is a plain HTML template that can access snippets (a snippet is a piece of code that can generate content, much like a component).

For example, let’s define a view:

<div class="lift:My.text"></div>

Now we need a snippet:

class My {
  def text = "*" #> (<b>Some text</b>)
}

Here we tell Lift to “replace all content with Some text“. We define a CSS selector (“*“) and then fill the matching element with the markup. Since Scala natively supports XML literals, we can pass the XML content with no special markup. You may ask: “If the markup is complex, am I supposed to write it all in my snippet?”. And my answer is: not at all. By using CSS selectors you can insert values in specific places. For example, we have a view:

<div class="lift:My.complex"><span id="text"></span><b id="boldy"></b></div>

Let’s fill the span and the b with some text.

def complex = "#text *" #> "Normal text" & "#boldy *" #> "Bold text"

And of course you can iterate over and repeat content, build tables, and so on. This approach is especially convenient for Ajax requests.
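As a sketch of how repetition looks (the snippet name Listing and its data are made up for illustration; the #> operator comes from Lift’s net.liftweb.util.Helpers, and binding a sequence to a selector repeats the matched element once per item):

```scala
import net.liftweb.util.Helpers._

// Template (hypothetical): <ul class="lift:Listing.list"><li></li></ul>
class Listing {
  // In a real application this would come from a service or database.
  val items = List("first", "second", "third")

  // Clones the matched <li> once per item and fills each clone's content.
  def list = "li *" #> items
}
```

The same pattern scales to tables: bind a list of rows to "tr", then bind each row’s fields to the individual "td" elements.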

Ajax and Comet support. Lift lets you create Ajax actions and update content on the page with zero hand-written JavaScript. It does this through helper methods that generate the markup and JavaScript for invoking Ajax calls and updating the page. You can also pass your own JavaScript to be executed after an Ajax request. And through Comet support, Lift allows the server to push changes to the client when required.
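A minimal sketch of such an Ajax action, using Lift’s standard SHtml.ajaxButton helper and the SetHtml JavaScript command (the snippet name, element ids, and template binding here are made up for illustration):

```scala
import net.liftweb.http.SHtml
import net.liftweb.http.js.JsCmds.SetHtml
import net.liftweb.util.Helpers._

// Template (hypothetical): <span class="lift:Clock.refresh">
//   <button id="refresh"></button> <span id="time"></span>
// </span>
class Clock {
  // Renders a button that fires an Ajax request on click; the returned
  // JsCmd replaces the contents of the element with id "time" on the client.
  def refresh = "#refresh" #> SHtml.ajaxButton(
    "Refresh",
    () => SetHtml("time", <span>{new java.util.Date().toString}</span>))
}
```

No JavaScript is written by hand: Lift generates the wiring for the round trip and the DOM update.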

Scoped variables. This functionality is similar to using a Spring AOP scoped proxy: in Spring you define a bean, set its scope to session or request, and inject it into your controller. In Lift you define a property of type RequestVar or SessionVar to hold the value. For example, sessionVariable here is session scoped:

class My {
  object sessionVariable extends SessionVar[Box[String]](Empty)
}

Box here is a value holder that can be empty or contain a value, in our case a String. Lift builds on Java Servlet technology, and session-scoped variables live in the servlet session. And of course you can have a fully stateless application, as Lift keeps no additional state of its own.
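For illustration, reading and writing such a variable might look like this (the variable name and values are hypothetical; set, is, and openOr are the standard SessionVar/Box operations):

```scala
import net.liftweb.common.{Box, Empty, Full}
import net.liftweb.http.SessionVar

class My {
  // Session-scoped holder, initially Empty.
  object currentUser extends SessionVar[Box[String]](Empty)

  def remember = {
    currentUser.set(Full("alice"))   // stored in the servlet session
    currentUser.is.openOr("guest")   // read back; falls back to "guest" when Empty
  }
}
```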

Lift also comes with two persistence solutions: Mapper, an ORM, and a broader solution, Record, which also supports document stores like CouchDB and MongoDB.

You can see some more examples at Lift demo page.

Conclusion

If you’re looking for your next web application framework, Lift is definitely worth looking into. It is fast, backed by a functional programming language (which gives you nice things like mixins, which Scala calls traits, and function objects), and actively developed. It is also fairly well documented: there are two free books, a wiki, and some blogs about it. I say fairly, because in this respect there is certainly room for improvement.

Exporting JGraph to SVG

Yesterday I faced the problem of exporting a JGraph graph to an SVG image. Neither SVGGraphWriter nor the solution presented in the JGraph tutorial worked.

Using SVGGraphWriter from JGraph, I got all of my graph nodes and connections dumped one on top of the other. The JGraph manual (can be found here, page 97) proposes using Apache Batik and painting the graph with the SVGGraphics2D class the library provides. The problem is that

JGraph graph = ...
SVGGraphics2D graphics = ...
...
graph.paint(graphics);

just doesn’t work. So I decided to explore the JGraph source and see if there was any way to make this work.

And voila, a few minutes later I had a solution:

import java.awt.Dimension;
import java.awt.geom.Rectangle2D;
import java.io.OutputStreamWriter;
import javax.swing.RepaintManager;

import org.apache.batik.dom.GenericDOMImplementation;
import org.apache.batik.svggen.SVGGraphics2D;
import org.jgraph.JGraph;
import org.jgraph.plaf.basic.BasicGraphUI;
import org.w3c.dom.DOMImplementation;
import org.w3c.dom.Document;

OutputStreamWriter writer = ...;

Object[] cells = graph.getRoots();
Rectangle2D bounds = graph.toScreen(graph.getCellBounds(cells));

DOMImplementation domImpl = GenericDOMImplementation.getDOMImplementation();
Document document = domImpl.createDocument(null, "svg", null);

SVGGraphics2D svgGraphics = new SVGGraphics2D(document);

svgGraphics.setSVGCanvasSize(new Dimension(
        (int) Math.round(bounds.getWidth()),
        (int) Math.round(bounds.getHeight())));

// Disable double buffering so the graph paints directly into the SVG canvas
RepaintManager repaintManager = RepaintManager.currentManager(graph);
repaintManager.setDoubleBufferingEnabled(false);

// The magic is these two lines: paint through the UI delegate instead of graph.paint()
BasicGraphUI gui = (BasicGraphUI) graph.getUI();
gui.drawGraph(svgGraphics, bounds);

svgGraphics.stream(writer, false);

Ninja Aspects

Aspect-oriented programming is without doubt a powerful tool when used correctly. However, it can lead to problems when using what I call ninja aspects.

Ninja aspects are aspects that lurk in the code base, waiting to strike without the programmer even knowing it.

A popular way to use transactions with Spring is its @Transactional annotation. The annotation tells the programmer that the annotated method (or all of the class’s methods, if the annotation is at class level) will be decorated with transaction-handling code. An annotation is a very visible way of saying “an advice is applied here”.

class Test {
    @Transactional
    public void transactionalMethod() {
        // ...
    }
}

Let’s take the same example and apply a ninja aspect.

<bean id="test" class="Test"/>

<tx:advice id="txAdvice" transaction-manager="txManager">
    <tx:attributes>
        <tx:method name="transactionalMethod"/>
    </tx:attributes>
</tx:advice>

<aop:config>
    <aop:pointcut id="testPointcut" expression="execution(* Test.*(..))"/>
    <aop:advisor advice-ref="txAdvice" pointcut-ref="testPointcut"/>
</aop:config>

Now we don’t need the annotation, and the transaction-handling advice is still applied. A programmer looking at the class Test is unaware of this behavior. It is possible that the programmer changes the body of the method so that it no longer requires transactions (for example, instead of reading from the database it now calls a web service), but since he is not aware of the advice, the transaction handling stays. If the aspect is transactions, we just lose some performance (potentially even a round trip to the database to commit an empty transaction), but aspects can be more serious than that. They can lead to unexpected (from the unsuspecting programmer’s point of view) behavior and make the code harder to understand. Of course, we can deduce from the stack trace that the class is decorated with an AOP proxy, but that still leaves us the challenge of actually locating the advice.

It is very important to make all aspects visible; otherwise they can strike from the shadows and leave an unsuspecting programmer wondering what the hell happened…