I watched a really fascinating video on prime spirals recently and decided to see if I could write my own code to generate them. It's not terribly efficient but it works! Code is on GitHub! :D
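The core of such a generator is quite short. Here's a minimal sketch of the idea (not the code from my repo, just an illustration): walk the integers along a square spiral and record which positions hold primes.

```python
def is_prime(n):
    """Trial-division primality test; slow but fine for small spirals."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def spiral_coords(count):
    """Return (x, y) grid positions for 1, 2, 3, ... along a square spiral."""
    x = y = 0
    dx, dy = 1, 0
    step_len, taken, turns = 1, 0, 0
    coords = []
    for _ in range(count):
        coords.append((x, y))
        x, y = x + dx, y + dy
        taken += 1
        if taken == step_len:
            taken = 0
            dx, dy = -dy, dx  # turn 90 degrees counterclockwise
            turns += 1
            if turns % 2 == 0:  # lengthen the run every second turn
                step_len += 1
    return coords

# grid positions of the primes among the first 1000 integers
prime_points = [p for n, p in enumerate(spiral_coords(1000), start=1)
                if is_prime(n)]
```

Feeding `prime_points` into a scatter plot (matplotlib, say) produces the familiar diagonal streaks of the Ulam spiral.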
Tuesday, 18 August 2015
Dung Beetles are Extremely Cool
I've noticed little bugs scurry across the road quite a lot during my time at this college. Most of the time I've been too busy to actually go down and examine these bugs. Today, when I saw one of these critters crawling beside the road I decided to take a closer look.
What I found was a dung beetle in the process of rolling a ball of dung and trying to find a nesting spot. Dung beetles are extremely cool creatures. From Wikipedia:
The "rollers" roll and bury a dung ball either for food storage or for making a brooding ball. In the latter case, two beetles, one male and one female, stay around the dung ball during the rolling process. Usually it is the male that rolls the ball, while the female hitch-hikes or simply follows behind. In some cases, the male and the female roll together. When a spot with soft soil is found, they stop and bury the ball, then mate underground. After the mating, both or one of them prepares the brooding ball. When the ball is finished, the female lays eggs inside it, a form of mass provisioning. Some species do not leave after this stage, but remain to safeguard their offspring. The dung beetle goes through a complete metamorphosis. The larvae live in brood balls made with dung prepared by their parents. During the larval stage, the beetle feeds on the dung surrounding it.
Here's a video from National Geographic that shows what dung beetles do.
Tuesday, 28 July 2015
Creating a Simple Bot using the Telegram Bot API
So Telegram recently released an API for creating bots on Telegram! It's quite cool. Already, people have released nice Python libraries for it. For this bot I'm using one of them.
I decided to create a simple Telegram bot that I can use to control the music on my laptop. It's a testament to the power and simplicity of Python and the convenience of the Linux terminal that it only took me an hour to develop. Head over to my repo on GitHub if you want to look at the code!
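The core of a bot like this is just a polling loop over the Bot API's getUpdates method plus a dispatch table of shell commands. Here's a rough sketch of the idea (not my actual code; the /play-style command names and the use of `playerctl` for media control are assumptions for illustration):

```python
import json
import subprocess
import urllib.request

# hypothetical mapping from chat commands to media-player calls
COMMANDS = {
    '/play': ['playerctl', 'play-pause'],
    '/next': ['playerctl', 'next'],
    '/prev': ['playerctl', 'previous'],
}

def dispatch(text):
    """Return the shell command for a message, or None if unrecognized."""
    return COMMANDS.get(text.strip())

def poll(token, offset=0):
    """Long-poll the Telegram Bot API for new messages."""
    url = ('https://api.telegram.org/bot%s/getUpdates?timeout=30&offset=%d'
           % (token, offset))
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode())['result']

if __name__ == '__main__':
    token = 'YOUR_BOT_TOKEN'  # issued by @BotFather
    offset = 0
    while True:
        for update in poll(token, offset):
            offset = update['update_id'] + 1
            cmd = dispatch(update.get('message', {}).get('text', ''))
            if cmd:
                subprocess.call(cmd)
```

The real libraries wrap this loop up nicely, but it's worth seeing how little is underneath.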
Here's a video of it working!
Movie Review - Whiplash
Note: This post may contain spoilers.
I used to watch sci-fi and animated films exclusively, but now every once in a while I watch a random movie just for the fun of it. Yesterday that random movie turned out to be Whiplash - a movie about an aspiring drummer and his well-intentioned but abusive mentor.
I loved the movie but I disagree with the opinions of Terence Fletcher. There is a scene where he is asked if there is a line: a limit beyond which he might actually discourage his student from becoming the next great musician that the world loves. He replies that there is no such limit, because the next great won't be discouraged. While that's a cute sentiment, I think he's wrong. The world isn't divided into 'The Greats' and the talentless; plenty of people fall somewhere in between. And I think that by not caring about those who aren't destined to be 'Great', and even actively doing things that could damage them mentally, he is doing more harm than good. Additionally, I think a lot of people respond much better to constructive criticism and positive reinforcement than to abuse hurled at them. For every person with a story about an abusive mentor who made them who they are, I think there are a few dozen people with stories about how an incredibly supportive and positive mentor inspired them to be their best.
So while the story was inspiring, I really hope that people don't use Terence as an inspiration.
Saturday, 18 July 2015
Maker Faire Singapore
The day before yesterday I found out that there was a Maker Faire going on in Singapore this weekend! I immediately scrapped all my other plans for the weekend to get there. It was definitely worth it. I owe a lot to the maker movement. It's what got me interested in engineering and I think it's a big part of the reason I am the person I am today!
The highlight of the Maker Faire was definitely the whole hall dedicated to Intel Edison projects! There were so many cool things in that hall! I have to admit that I only had a vague idea of what the Intel Edison could do before I saw all those demos. It's a pretty capable SoC. My favorite demo was the dancing hexapods. They were really cool to watch. The second best demo was the robot with omnidirectional wheels.
The Dancing Hexapods
There was also quite an impressive display of 3D printing at the Faire. There were so many exhibits related to it! Apart from the conventional designs using steppers to move the extruder in the X-Y plane, I also saw a 3D printer based on a delta robot.
Saw this 3D printer that could print huge parts
The delta robot 3D printer
Why Self-Driving Cars are the Future of Personal Transport
So Google's self-driving cars are getting really good at driving themselves. A lot of people I know are skeptical about self-driving cars. Some say that they will never use one because they just love to drive. I think that once self-driving cars are actually on the road, they won't have a choice.
From everything the testing of self-driving cars has shown so far, they seem to be much safer than human drivers. They don't get tired and they never break the rules. Human drivers, on the other hand, cause over a million deaths every year through carelessness and driving while sleepy, drunk or under the influence of drugs. Even if self-driving cars only cut the number of car accidents in half, we have a moral obligation to switch over. By choosing to keep driving manually, we are choosing to let those deaths continue happening every year.
I think that sometime in the distant future people will talk with horror about the time when people were allowed to drive their own cars, just like we now talk with horror about the state of medical practice before we knew about germs and how exactly the human body worked.
Friday, 3 July 2015
Terminator Genisys – Review
I watched the movie a few days ago. And I thought I’d write down some thoughts I had about the movie.
First of all, I have to admit that although I roughly know what happens in the earlier movies, I have only watched one of the previous three. So this is the first Terminator film that I've watched fully.
For me, the movie was just OK. Nothing spectacular. Visual effects were nice and that was probably the best part of the movie.
Apart from the fact that the movie reuses the completely overused plot device of time travel to reset what happened in the earlier movies, my main complaint about the film is the portrayal of the new, enhanced, symbiotic John Connor as evil. By merging with the machines, John Connor has finally solved two of humanity's greatest problems: aging and disease! Why does the movie portray that as bad?
In fact, one of the most interesting things researchers in medicine are working on right now is using nanobots for targeted drug delivery. Imagine having an army of nanobots inside you making sure you don't age and never get any disease.
If everyone were to merge with the machines like John Connor did there would actually be peace again and human civilization would have taken the next logical step of using their now immortal bodies to explore the vastness of interstellar space!
For once the AI in the movie has a goal to work together with humans and they just had to portray that as evil. I’d love to see a Terminator film where the humans and the AI (it’s supposed to be super intelligent!) finally realize that the best thing to do is cooperate.
Monday, 22 June 2015
Another Weekend in Singapore
Now that I'm working at a lab, I think I have a good idea of what working a 9-to-5 job is like. I barely have time to do anything on weekdays; free time has become a very precious resource. So weekends are the only time I get to actually do stuff. And most of the time, after a busy week, all I feel like doing on the weekend is relaxing.
I’ve started going to the movies almost every week now. I didn’t have an opportunity to do that in Trichy. The theaters there are horrible. They tend not to show the kind of movies that I want to watch and when they do they either run it at a really inconvenient time or they dub it in Tamil. This place is heaven compared to Trichy. There are so many awesome places to see and visit! And so many good theaters!
Last Saturday, I decided to go visit the library. I felt like sitting in a quiet place and relaxing by reading something. I hadn’t done that in a really long time. I browsed around till I found a book that caught my eye. It was “Contact” by Carl Sagan. And I just sat down there and read for the next 4 hours. I really missed doing that. I’d do that a lot when I was a little kid. But after starting college I didn’t have much time to do that.
I went back to the library again on Sunday. This time I thought I’d do some reading on machine learning. There were these tables with power sockets on them set aside for people who wanted to do some work in the silence of the library. It was nice to sit there and work. The girl sitting beside me was also studying machine learning apparently. She had this book called “Supervised learning With Complex Valued Neural Networks” on her desk.
I think I’ll visit the library at least once a week from now on. It was nice to take a break from the general noise of the city.
I took a picture of the library building! It looks pretty cool!
Saturday, 20 June 2015
Using IFTTT for blog syncing.
So since I decided to keep both my blogs running in parallel, I've only posted two blog posts and I'm already starting to find it annoying to update both blogs every time I get a new idea for a post. So I decided to try out this thing that I signed up for a long time ago, but never really used till now: IFTTT (IF This Then That). It's a web service that automates your social media accounts by triggering certain actions automatically when you do something. For example, you can set it up so that whenever you like a video on YouTube, it's automatically tweeted, shared on Facebook or posted on your blog. It's actually quite brilliant!
Now, I signed up for this service when it was just starting out because I thought it was a really cool idea. But at that time I didn't really have much going on on the web, so even though I had an account, I never used a single recipe till now. I've finally created one so that whenever there's a new post on the WordPress blog on my website, the same thing is automatically posted to my Blogger blog as well.
Friday, 19 June 2015
Analysing sound in Python
I'm trying to build a simple word recognition system in Python. As a first step, I needed to find a way to get audio sample data from my microphone and store it in a numpy array. After a lot of searching and experimenting I finally found a library that works well for this task: pyalsaaudio.
This small piece of code records roughly two seconds of audio from the default microphone and plots the spectrogram. This was actually a bit tricky to figure out so I thought I'd share the code for anyone out there who might be trying to do this.
#!/usr/bin/python
import struct
import time

import alsaaudio as aa
import numpy as np
from pylab import specgram, show, cm

SAMPLERATE = 8000
PERIODSIZE = 160
CHANNELS = 1
CARD = 'default'

# open the default capture device in non-blocking mode
inp = aa.PCM(aa.PCM_CAPTURE, aa.PCM_NONBLOCK, CARD)
inp.setchannels(CHANNELS)
inp.setrate(SAMPLERATE)
inp.setformat(aa.PCM_FORMAT_S16_LE)
inp.setperiodsize(PERIODSIZE)

sound = np.array([0])

if __name__ == '__main__':
    ctr = 20000
    while ctr > 0:
        ctr -= 1
        l, data = inp.read()
        if l > 0:
            # each sample is a signed 16-bit little-endian integer
            samples = struct.unpack('h' * l, data)
            sound = np.append(sound, np.array(samples))
        time.sleep(0.0001)
    print(sound.size)
    # plot the spectrogram of the recorded audio
    # (use plt.plot(sound) instead for a plain waveform view)
    Pxx, freqs, bins, im = specgram(sound, NFFT=1024, Fs=SAMPLERATE,
                                    noverlap=900, cmap=cm.gist_heat)
    show()
Tuesday, 16 June 2015
I have a new website!
So it's summer holidays again! This time I'm spending my holiday as a research intern at NUS in Singapore. I love the labs here! They're absolutely amazing. I got an opportunity to see one of those huge KUKA robotic arms up close. My friends and I are working on a four-legged robot that uses a lot of compliance and underactuation. It's been quite a fun experience. Singapore is a really nice place. I had a little bit of trouble adjusting initially because the rooms we were staying in were quite small. Then I realized that that's a consequence of living in a big, congested city like Singapore: real estate is expensive, so cheap accommodation will be small.
Moving on, I've been thinking of setting up a website for myself for quite some time. But I never had a big enough block of free time to set it up. So I finally decided to do it during these holidays. It took me almost a week to tweak it to my satisfaction and even then there's a lot of room for improvement. But I think it's finally time to release my website to the internet! It's live at www.ashwinnarayan.com! I've documented a lot of the projects that I've done over the last few years there.
I'm having a little bit of trouble deciding what to do with my Blogger blog. My new website has a blog and I've already imported most of the posts to it. I'm slowly publishing the older posts after fixing some formatting issues. But I'm not sure that I want to leave this blog either. I've been writing in here for years and years. I think I'll keep both the blogs alive and post in both of them. Maybe I will use the blog on my website to post exclusively about technical stuff. Or maybe I'll post on both blogs in parallel for a while.
Monday, 2 February 2015
VGA Output from an FPGA
So after getting the simple 8-bit counter running on the FPGA, I decided to get started on generating VGA output, because that seemed like a reasonably simple yet really cool-looking project. Initially I thought I'd go the cheap way and solder up my own VGA circuit, but I just ended up wasting a lot of time and effort on a really substandard circuit that would die on me all the time. So I got myself one of these VGA modules from the Numato Labs website.
First, a bit about VGA signals. VGA was originally intended to work with CRT monitors, so the standard was built around the working of a Cathode Ray Tube (CRT). Inside every CRT is an electron gun that produces a beam of really fast electrons. Electric fields deflect this beam so that it can fall on different positions on a screen. The screen has dots (pixels) coated with phosphors that glow when hit by electrons; together, these glowing dots form the image.
CRT monitors build up an image by scanning this electron beam horizontally across the screen in one line, then moving the beam a short distance downward and scanning again and again. The image is built up line by line.
So a VGA signal has two synchronization signals called Vertical Sync (VS) and Horizontal Sync (HS). HS pulses every time the screen starts scanning a new line, and VS pulses every time the whole screen has been scanned once. For more details on VGA signals, visit this page.
The R, G and B signals have nominal full-scale values of 0.7 volts: 0 volts on the R line means no red component and 0.7 volts means maximum red. The inputs have impedances of 75 ohms.
I took a look at the schematic of the VGA module.
You'll notice that there are three pins connecting to the red and green input signals and two connecting to the blue input signal. The pins are connected through resistors whose values rise in powers of two. This is a simple way to build a DAC. I decided to work through the equations to see what voltage levels were available to me.
For the red and green channels, I took the value of the smallest resistor as $R$ and the output voltage as $V$, and went from there.
$\frac{V-V_1}{4R} + \frac{V-V_2}{2R} + \frac{V-V_3}{R} + \frac{V}{R_L} = 0$ (Using KCL)
Rearranging in terms of $V$ we get:
$V = \frac{V_{CC}}{7 + \frac{4R}{R_L}} (b_0 + 2b_1 + 4b_2)$ where I set $V_{i+1} = V_{CC}\, b_i$
Substituting the numerical values ($V_{CC} = 3.3\,\text{V}$, $R_L = 75\,\Omega$, $R = 500\,\Omega$) I get:
$V = 0.09802(b_0 + 2b_1 + 4b_2)$ where $b_2 b_1 b_0$ is a 3-digit binary number. This allows me to have 8 different shades of red! Following a similar procedure I can get expressions for the other colors. In total I need 8 bits (3 for red, 3 for green, 2 for blue (poor blue :( (I like using nested parentheses. (I should probably stop now.)))) to represent the color of one pixel. So I have a palette of 256 colors to work with.
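As a sanity check, the derivation above can be evaluated numerically. A quick sketch, using the resistor values from the derivation:

```python
# resistor-ladder DAC from the derivation above
VCC = 3.3   # supply voltage (V)
RL = 75.0   # monitor input impedance (ohms)
R = 500.0   # smallest ladder resistor (ohms)

def dac_voltage(code):
    """Output voltage for a 3-bit color code b2 b1 b0."""
    b0, b1, b2 = code & 1, (code >> 1) & 1, (code >> 2) & 1
    return VCC / (7 + 4 * R / RL) * (b0 + 2 * b1 + 4 * b2)

levels = [dac_voltage(c) for c in range(8)]
print(['%.3f' % v for v in levels])
```

The eight levels come out evenly spaced, and the maximum (about 0.686 V) lands right at the 0.7 V VGA full-scale level, which is presumably why those resistor values were chosen.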
Now it's time to get started with generating the VGA sync signals.
I'm learning from this textbook called FPGA Prototyping by Verilog Examples by P. Chu. It has a brilliant section on generating VGA output. However, I couldn't just copy the code in there and check if the circuit was working. I have a small external monitor which I use for testing purposes. It turns out that this monitor (because it's a very cheap monitor) only supports one video mode. 1368 x 768 @ 60 Hz refresh rate. Nothing else works. Following the signal timing information about this resolution from the tinyVGA website, I modified the code (actually I typed it out line by line on my own so I could understand what was going on in each line. I highly recommend this technique when you're learning a new language.)
It took quite a while for me to get the output working properly. First I tried using the ipcore wizard to generate a pixel clock that was exactly the same as the recommended one. But when I tried to get the ISE to compile the code, it complained about timing issues. So I used the ipcore wizard to generate a clock that was exactly twice that of the pixel clock and divided this clock by 2 inside the VGA synchronization module.
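The arithmetic that relates the pixel clock to the sync frequencies is worth writing down. As an illustration, here it is for the standard 640 x 480 @ 60 Hz mode rather than my monitor's mode, since those timing numbers are widely published:

```python
# standard 640x480@60 VGA timing parameters
PIXEL_CLOCK = 25.175e6          # Hz
H_TOTAL = 640 + 16 + 96 + 48    # visible + front porch + hsync + back porch (pixels)
V_TOTAL = 480 + 10 + 2 + 33     # visible + front porch + vsync + back porch (lines)

hs_freq = PIXEL_CLOCK / H_TOTAL  # horizontal sync (line) frequency
vs_freq = hs_freq / V_TOTAL      # vertical sync (refresh) frequency

print('HS: %.2f kHz, VS: %.2f Hz' % (hs_freq / 1e3, vs_freq))
```

The same arithmetic with the 1368 x 768 numbers from tinyVGA gives the pixel clock the synchronization module has to run at.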
Here's a video of the working VGA output:
Now I think I'll get started on generating something interesting on the fpga. Maybe some fractals?
Monday, 26 January 2015
i3 Window Manager - After 7 months.
So I've been using the i3 window manager for quite a while now, and once in a while I miss some of the functionality that certain programs in Ubuntu provided, especially the network settings tool. So, just for one evening, I decided to switch over to the GNOME desktop environment.
I had to switch back to i3 within a few hours. After the wonderfully customizable, workflow-friendly key combinations I had set up in i3, fiddling about with the mouse in a desktop environment felt really clunky and frustrating. It took me more than twice as long to do a simple task like opening a text editor and writing some C code.
Rich desktop environments like GNOME and Unity are fine if you're a beginner or if you don't plan on using your computer for much beyond web browsing. But if you're into programming or doing anything even remotely technical with your computer, a keyboard-friendly tiling window manager like i3 will definitely increase your productivity.
Monday, 5 January 2015
Setting Up a Cool Dr. Who Wallpaper That Changes Every Hour on Linux
I came across this really cool idea by someone on reddit: take this awesome wallpaper of the 12 regenerations of the Doctor and highlight each Doctor by the hour. This is easy enough to do on Windows, which can cycle wallpapers at a set interval. Being a Linux user, I was interested in getting a similar thing to work on Linux.
First here is the set of all the wallpapers.
https://imgur.com/a/av5TP
Download the wallpapers and put them all in a folder somewhere on your system. Say you put them in /home/ash/Pictures/whopaper/
Rename each picture so that the filename corresponds to the regeneration number. The first Doctor's highlighted picture should be named 01.jpg, the second Doctor's 02.jpg, and so on up to 12.jpg.
Then set up a cronjob that runs every hour, checks the hour on the clock, and sets the background accordingly. To do this I installed a program called "feh" (sudo apt-get install feh), which lets me set the desktop wallpaper from the terminal, and a program called gnome-schedule (sudo apt-get install gnome-schedule), which makes it easy to edit the crontab file.
Run gnome-schedule and click on "New" to add a new cronjob. Select "A task that launches recurrently". Add a short description (like "Cool Doctor Who Wallpaper") and in the command field enter the following command:
env DISPLAY=:0 /usr/bin/feh --bg-scale "/home/rationalash/Pictures/Wallpapers/DRWHO/$(date +\%I).jpg">>sys.log
Replace the path in the command with the path to the folder in which you saved the wallpapers.
Under the "Time & Date" section select "Basic" and set the task to run every hour.
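The whole trick is the `date +%I` in that command: it expands to the hour on a 12-hour clock, zero-padded, so it lines up exactly with the 01.jpg-12.jpg filenames. The same mapping in Python looks like this (just an illustration of the logic, not part of the setup):

```python
def wallpaper_for(hour_24):
    """Map a 24-hour clock hour to the matching Doctor image (01.jpg-12.jpg)."""
    h = hour_24 % 12
    if h == 0:       # 12-hour clocks show 12, not 0
        h = 12
    return '%02d.jpg' % h

print(wallpaper_for(13))  # 1 pm -> the first Doctor
```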
And that's it! If you don't keep your computer on all the time, the wallpaper might be wrong for a while because the cronjob only runs once an hour. If you want the wallpaper to always match the hour even when you turn your computer on and off a lot, edit the frequency of the cronjob so that it runs more often (say, once a minute). Then the wrong wallpaper won't stay up for longer than a minute.