Jean-Baptiste Onofré: Some book reviews: Instant Apache Camel Messaging System, Learning Apache Karaf, and Instant Apache ServiceMix How-To
I’m pleased to be a reviewer of some new books published by Packt:
I received a “hard” copy from Packt (thanks for that), and I’m now able to do the review.
Instant Apache Camel Messaging System, by Evgeniy Sharapov. Published by Packt Publishing in September 2013
This book is a good introduction to Camel, covering the fundamentals.
What is Apache Camel
It’s a quick introduction to Camel, in only four pages, giving a good overview of the basics: components, routes, contexts, EIPs, etc.
We have to take it for what it is: just a quick introduction. Don’t expect a lot of detail about the Camel basics; it provides only a very high-level overview.
Installation
To be honest, I don’t like this part. It focuses mostly on using Maven with Camel: how to use Camel with Maven, how to integrate Camel into your IDE (Eclipse or IntelliJ), and how to use the archetypes.
I think it’s too restrictive. I would have preferred a quick rundown of the different ways to install and use Camel: in a Karaf/ServiceMix container, in a Spring application context, in Tomcat or another application server, etc.
I’m afraid that some users will pick up bad habits reading this part.
Quickstart
This part goes a bit deeper into CamelContext and RouteBuilder. It’s a good chapter, but I would have focused a bit more on the DSLs (at least Java, Spring, and Blueprint).
The example used is interesting, as it uses different components, transformations, predicates, and expressions.
It’s a really good introduction.
Conclusion
It’s a good introductory book, but only for new Camel users. If you already know Camel, I’m afraid you will be disappointed and won’t learn a lot.
If you are a Camel rookie and you want to move forward quickly, with a ready-to-use example, this book is a good one.
I would have expected more details on some key Camel features, especially the EIPs, and some real use cases combining EIPs with various components.
Learning Apache Karaf, by Jamie Goodyear, Johan Edstrom, and Heath Kesler. Published by Packt Publishing in October 2013
I helped a lot on this book, and I would like to congratulate my friends Jamie Goodyear, Johan Edstrom, and Heath Kesler. You did a great job, guys!
It’s the perfect book to start with Apache Karaf. All Karaf features are introduced, and more, like Karaf Cellar.
It’s based on Karaf 2.x (an update will be required for Karaf 3.0.0, as a lot of commands, etc., have changed).
The overall content is great for beginners. If you already know Karaf, you probably know most of the content; however, the book can still be helpful for discovering features like Cellar.
Good job, guys!
Instant Apache ServiceMix How-To, by Henryk Konsek. Published by Packt Publishing in June 2013
This book is a good complement to the Camel and Karaf ones. Unfortunately, some chapters are a bit redundant: you will find the same information in both books.
However, as Apache ServiceMix is powered by Karaf, starting from Learning Apache Karaf makes sense and gives you details about the core of ServiceMix (the “ServiceMix Kernel”, which is the genesis of Karaf).
This book is a good springboard into ServiceMix.
I would have expected some details about the ServiceMix NMR (naming, for instance) and about the different distributions.
ServiceMix is more than an umbrella project gathering Karaf, Camel, CXF, ActiveMQ, etc. It also provides some interesting features of its own, like naming. It would have been great to introduce these.
Conclusion
These three books are great for beginners, especially the Karaf one.
I was really glad and pleased to review these books. It’s a tough job to write this kind of book, and we have to congratulate the authors for their work.
It’s great work, guys!
It’s been an amazing ride; it’s hard to believe it’s been just a year! A huge thanks to the hawtio team and everyone who’s helped turn hawtio into a truly amazing console.
hawtio is really growing up fast and getting more awesome by the day. It constantly surprises me what awesomeness the hawtio team keeps on adding.
Highlights of the year
The highlights of the year for me are:
- lots and lots of plugins are now available for working with JVMs, JMX, logging and many frameworks like Apache Camel, Apache ActiveMQ, Infinispan, ElasticSearch and OSGi
- Apache ActiveMQ 5.9.x or later now ships with hawtio inside
- Apache Camel folks have effectively deprecated the old camel console in favour of hawtio
- JBoss A-MQ and JBoss Fuse 6.1 are coming with hawtio as the default Fuse Management Console
- hawtio works great standalone or in most containers now, like Apache Karaf, Apache Tomcat, Jetty, and WildFly
- in Camel we get real-time visualisations of running Camel routes inside a JVM, can watch the metrics update in real time, visually design Camel routes, and trace or debug running routes
- in ActiveMQ we can see all the queues, topics and metrics; create queues/topics, browse queues, and on 5.9.x we can resend DLQ messages, move messages from one queue to another, delete messages, send messages and see destination consumer/producer diagrams. When using Fuse 6.1 we can visually design clustered broker topologies (e.g. for geographic store and forward networks).
- in OSGi there is support for all main aspects; from viewing bundles, features, Config Admin, declarative services, viewing services, packages, dependency graphs, diagnosing class loading issues, navigating from bundle to maven metadata to source/javadoc, to using the Karaf shell from a browser.
- when using JBoss Fuse 6.1, hawtio becomes a full-featured UI for working with many containers in a fabric: creating containers, editing profiles, looking inside runtimes, browsing logs etc.
Or being able to search maven repositories and view versions, source or javadoc from inside the browser. Also the interactive developer help is pretty cute; so you can play around with all of hawtio's angularjs directives in the browser ;)
Getting Started
I don’t think you truly understand how awesome hawtio is until you start using it. So get started today!
We love contributions, so please dive in and help; even if it’s just ideas for how to make things even more awesome.
hawtio is built on AngularJS; I've used many different UI and web frameworks over the years (most of them TBH) and I really can't recommend AngularJS highly enough. So if you fancy learning AngularJS, why not try hacking a new plugin or adding some functionality to an existing plugin you like? There's lots of ideas already if you're not sure what to do.
Check out the developer guide for more details on getting started and building the code.
hawtio 1.2.0 released!
To celebrate hawtio’s first birthday we’ve just released 1.2.0 today! It should be synced to Maven Central in the next hour or two.
There are 407 issues fixed in this release (most of them new features or improvements I might add!)
So what are you waiting for? Go get it while it’s hawt!
Don't cha wish your console was hawt like hawtio?
This is "How Mr. Weasel was made an outcast", from the book Mother West Wind "How" Stories by Thornton W. Burgess. I haven't recorded something in a long time, but hopefully will do more of these in the weeks to come.
I’m trying to avoid doing this in order to avoid more power consumption and unpopular hardware in the house — but if necessary, this is a good up-to-date homebuild design
Interesting article about using mmap’d files from Java via RandomAccessFile.getChannel().map(), which allows them to be accessed directly as a ByteBuffer. Together with atomic variable lazySet() operations, this provides pretty excellent performance on low-latency writes to disk. See also: http://psy-lob-saw.blogspot.ie/2012/12/atomiclazyset-is-performance-win-for.html
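A minimal sketch of the memory-mapping half of that technique (class and file names here are illustrative, not from the article): map a region of a file through RandomAccessFile.getChannel().map() and read/write it directly as a ByteBuffer.

```java
import java.io.File;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MmapSketch {
    // Maps a scratch file into memory and round-trips a long through the mapping.
    public static long roundTrip(long value) throws Exception {
        File f = File.createTempFile("mmap-sketch", ".dat");
        f.deleteOnExit();
        try (RandomAccessFile raf = new RandomAccessFile(f, "rw");
             FileChannel ch = raf.getChannel()) {
            // The mapped region behaves like an ordinary ByteBuffer, but writes
            // land in the page cache directly rather than going through write() calls.
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_WRITE, 0, 4096);
            buf.putLong(0, value);
            return buf.getLong(0);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip(42L)); // prints 42
    }
}
```

The low-latency trick in the linked post pairs this with AtomicLong.lazySet() to publish a write cursor without a full memory fence; that part is omitted here.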
a realtime processing engine, built on a persistent queue and a set of workers. ‘The main goal is data availability and persistency. We created grape for those who cannot afford losing data’. It does this by allowing infinite expansion of the pending queue in Elliptics, their Dynamo-like horizontally-scaled storage backend.
‘remember, there is no axe murderer. probably’
MITM attacks via BGP route hijacking are now relatively commonplace on the internet, with 60 cases observed so far this year by Renesys
Awesome and amusing parody of the trailer for Alfonso Cuarón’s Space. If you’ve ever been in an IKEA and seen Space, you’ll probably love this mashup.
Well, as of this moment it's almost set, since the final match (Uruguay - Jordan) has not yet been played.
But let's be bold, and assume that Jordan won't manage to overcome a 5 goal deficit on the road against Uruguay.
So we have determined the 32 qualifying teams.
Regionally, it breaks down as follows:
- Europe: 13 teams
- South America: 6 teams (assuming Uruguay qualifies)
- Africa: 5 teams
- North and Central America: 4 teams
- Asia: 4 teams
We now have 2 weeks until the group draws, on December 6th.
The final draw will be complicated. The top 8 teams will be based on the October ranking, which means there will almost certainly be 4 teams from Europe and 4 teams from South America. As Wikipedia says, “The other pots will be based on geographic and sports criteria.” These wacky criteria tend to be things like: Brazil and Argentina will be in groups that, assuming they both advance, won’t meet until the semifinal. Spain and Germany will be similarly distributed. And the teams from the various continents will be spread “evenly” across the groups, so that no group will contain more than 2 European teams, or more than 1 team from any other region.
Since it looks like the Netherlands will not be in the top 8, my pronouncement is: whichever group the Dutch are in is "the group of death". Of course, there are a few other extremely strong teams that are just outside the top 8, including: England, Chile, and Portugal.
Regardless, it looks like it will be a very strong lineup overall, with many great teams. The top team that didn’t qualify is Ukraine, which shockingly lost to France yesterday and is out.
And, of course, as ESPN points out, there are some other well-known teams that missed the boat: Sweden, Serbia, and Turkey, among others.
And, probably, many football fans will have only one question on their minds: will Leo Messi be healthy by next spring?
But I think it will be a fine selection of teams that make their way to Brazil next summer.
Start planning your viewing parties now! Where will you be on July 13, 2014?
[preamble: this is not me writing against collecting data and analysing user behaviour, including TV viewing actions. I cherish the fact that Netflix recommends different things to different family members, and I'm happy for the iPlayer team to get some generic usage data and recognise that nobody actually wants to watch Graham Norton purely from the way that all viewers stop watching before the introductory credits are over. What is important here is that I get things in exchange: suggestions, content. What appears to be going on here is that a device I bought is sending details of TV watching activity so as to better place adverts on a bit of the screen I paid for, possibly in future even interstitially during the startup of a service like Netflix or iPlayer. I don't appear to have got anything in exchange, and nobody asked me if I wanted the adverts, let alone the collection of the details of myself and my family, including an 11 year old child.]
Just after Christmas I wandered down to Richer Sounds and bought a new TV, first one in a decade, probably second TV we've owned since the late 1980s. My goal was a large monitor with support for free to air DTV and HD DTV, along with the HDMI and RGB ports to plug in useful things, including a (new) PS3 which would run iPlayer and Netflix. I ended up getting a deeply discounted LG Smart TV as the "smart" bits came with the monitor that I wanted.
I covered the experience back in March, where I said I felt the smart bit was AOL-like in its collection of icons for things I didn't want and couldn't delete, its dumbed-down versions of Netflix and iPlayer, and its unwanted adverts in the corner. But that's it; the Netflix tablet/TV integration compensates for the weak TV interface, and avoids the problem of PS3 access time limits on school nights, as the PS3 can stay hidden until weekends.
Only a few days later, Libby Miller pointed me at an article by DoctorBeet, who'd spun Wireshark up to listen to what the TV was saying, showing how his LG TV does an HTTP form POST to a remote site on every channel change, as well as sending details of filenames on USB sticks.
This is a pretty serious change to what a normal television does. DoctorBeet went further and looked at why. Primarily it appears to be for advert placement, including in that corner of the "smart" portal, or at start time after you select "premium" content like iPlayer or Netflix. I haven't seen that, which is good; an extra 1.5MB download for an advert I'd have to stare through is not something I'd have been happy with.
Anyway, go look at his article, or even a captured request.
I'm thinking of setting up Wireshark to do the same for an evening. I made an attempt yesterday, but as the TV is connected by CAT-5 to a 1Gb/s hub, then an Ethernet-over-power bridge to get into the base station, it's harder than I'd thought. My entire wired network is on switched ports so I can't packet sniff, and the 100Mb/s hub I dredged up from the loft turned out to be switched too. That means I'd have to do something innovative, like use the WEP-only 802.11b Ethernet-to-wifi bridge I also found in that box, hooked up to an open wifi base station plugged into the real router. Maybe at the weekend. A couple of days' logs would actually be an interesting dataset, even if it just logs PS3 activity hours as time-on-HDMI-port-1.
What I did do is go to the "opt out of adverts" settings page DoctorBeet had found, scrolled down, and eventually followed a legal info link to get back to the privacy settings. Which I did photograph this time, and which are now up on Flickr.
Some key points of this policy:
Information considered to be non-personally identifiable includes MAC addresses and "information about the live content you are watching".
That's an interesting concept, which I will get back to. For now, note that that specific phrase is not indexed anywhere in BigTable, implying it is not published anywhere that Google can index it.
Or "until you sit through every page with a camera this policy doesn't get out much"
If you have issues, don't use the television
That's at least consistent with customer support.
Anyway, there are a lot more slides. One of them gives a contact who, when you look him up on LinkedIn, turns out not only to be the head of legal at LGE UK, but also one hop away from me: datamining in action.
Now, returning to a key point: Is TV channel data Non-personal information?
Alternatively: If I had the TV viewing data of a large proportion of a country, how would I deanonymize it?
The answer there is straightforward: I'd use the work of Arvind Narayanan and Vitaly Shmatikov, Robust De-anonymization of Large Sparse Datasets.
In that seminal paper, Narayanan and Shmatikov took the anonymized Netflix dataset of (viewers->(movies, rankings)+) and deanonymized it by comparing film reviews on Netflix with IMDb reviews, looking for reviews that appeared on IMDb shortly after a Netflix review, with ratings matching or close to those of the Netflix review. They then took the sequence of a viewer's watched movies and looked to see if a large set of their Netflix reviews met that match criteria. At the end of this they managed to deanonymize some Netflix viewers, correlating each with an IMDb reviewer many standard deviations out from any other candidate. They could then use this match to identify those movies which the viewer had seen and yet not reviewed on IMDb.
The authors had some advantages: both Netflix and IMDb had reviews, albeit on different scales. The TV data doesn't, so the process would be more ad hoc:
- Discard all events that aren't movies
- Assume that anything where the user comes in later than some threshold isn't a significant "watch event" and discard it.
- Assume that anything where the user watches all the way to the end is a significant "watch event" and may be reviewed later.
- Treat events where the viewer changes channel some distance into a movie (say 20 minutes) as a significant "watch failure" event, which may be reviewed negatively.
- Consider watch events where the user was on the same channel for some time before the movie began as less significant than when they tuned in early.
- If information is collected when a user explicitly records a movie, a "recording event", that is treated even more significantly.
- Go through the IMDb data looking for any reviews appearing a short time after a significant set of watch events, expecting higher ratings from significant watch events and recording events, and potentially low ratings from a significant watch failure.
I don't know how many matches you'd get here; as the paper shows, it's the real outliers you find, especially the watchers of obscure content.
Even so, the fact that it would be possible to identify at least one viewer this way shows that TV watching data is personal information. And I'm confident that it can be done, based on the maths and the specific example in the Robust De-anonymization of Large Sparse Datasets paper.
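The event-filtering heuristics above could be sketched like this (all names and thresholds are hypothetical illustrations, not from the paper or the captured data):

```java
public class WatchSignals {
    enum Signal { IGNORE, WATCHED, ABANDONED, RECORDED }

    // A single viewing event reconstructed from channel-change logs.
    // Times are in minutes relative to the movie's start.
    static final class Event {
        final boolean isMovie, explicitlyRecorded;
        final int tunedInAt;   // when the viewer arrived on the channel
        final int leftAt;      // when they changed channel away
        final int runtime;
        Event(boolean isMovie, boolean recorded, int in, int out, int runtime) {
            this.isMovie = isMovie; this.explicitlyRecorded = recorded;
            this.tunedInAt = in; this.leftAt = out; this.runtime = runtime;
        }
    }

    static final int LATE_THRESHOLD = 15;    // arriving later than this: not a real watch
    static final int FAILURE_THRESHOLD = 20; // bailing after this long: a negative signal

    static Signal classify(Event e) {
        if (!e.isMovie) return Signal.IGNORE;              // discard non-movies
        if (e.explicitlyRecorded) return Signal.RECORDED;  // strongest positive signal
        if (e.tunedInAt > LATE_THRESHOLD) return Signal.IGNORE;
        if (e.leftAt >= e.runtime) return Signal.WATCHED;  // watched to the end
        if (e.leftAt > FAILURE_THRESHOLD) return Signal.ABANDONED; // gave up mid-film
        return Signal.IGNORE;
    }
}
```

The output of a classifier like this would then be correlated against IMDb review timestamps and ratings, as in the Narayanan/Shmatikov approach.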
Conclusion: irrespective of the cookie debate, TV watching data may be personal, so the entire dataset of individual users must be treated this way, with all the restrictions on EU use of personal data, and the rights of those of us with a television.
I love FiftyThree’s Paper application and have been trying a variety of styli to use with it, but it looks like FiftyThree themselves may have made the killer one—especially for anybody who misses big fat chunky pencils.
Plus, even if you’re not interested in a stylus for your iPad, the webpage for the product is downright gorgeous. Worth a look.
Shawn Blanc has launched his new joint: The Sweet Setup.
We all use our iPhones, iPads, and Macs every day. We use them to communicate with friends and family, take pictures and videos of special events and moments, do our job, and more…There are so many fantastic apps and other tools to help us with all these tasks. And that’s why I built this site. Because I want to use the best tools for whatever the task is and I bet you do, too.
Among the initial entrants: Byword and Day One, two of my favorite applications. I’m totally looking forward to seeing Shawn and his collaborator’s work on this. I have a sneaking suspicion I know of at least one article that’s coming in December.
Following the arrival of the longer screws, all 4 motors were quickly attached and their directions checked. Only one was incorrect and needed its wires swapping. Despite the colours of the arms clearly showing direction, I also went with green propellers at the front and black at the back (as per the first quad) to give additional indication.
After a quick tweak of the PI values and a zeroing of the receiver inputs it was time to see how it flew.
The answer was: surprisingly well. Compared to the earlier design there was far less yaw evident when lifting off, and the extra indicators of direction made figuring out corrections easier. The shorter legs mean it sits closer to the ground, which appears to make it slightly less stable just as it lifts, but that was easily corrected. The main issue I ran across was that some of the receiver leads came loose and didn't seem to be seated as well as I'd have liked, constantly becoming disconnected. As I have spare cables, it's an easy fix.
As it was hovering around freezing when I was trying, I didn’t stay out long. The early signs are positive. Now if the gale force winds can just abate to allow me to tune the PI settings…
What I find hard to understand starts with the marketing: infrastructure providers, who have to make money from a rental business, pitch the cloud with the nuance that "even though the cloud is a cutting-edge technology with many advantages, it actually costs less."
Lately, big data has been playing out in a remarkably similar way, and that is SQL on Hadoop. It targets the existing data warehouse market with the pitch that, despite being "cutting-edge big data technology", it is cheap. Quietly shift the original problems of big data analysis onto the data scientists, and the big data package is complete.
So when we see news about some telecom or manufacturer adopting big data technology, cutting costs and making performance several times faster, what should we make of it?
It reminds me once again how frightening the press can be.
With all this talk of Hive and Impala ringing in our ears these days, what I want to say is that the narrow information quietly being poured into us carries more than just the cutting-edge innovation of data processing technology.
... though you probably couldn't tell a lot of difference from those previous Windows 8 posts.
As 3.7 gigabyte downloads go, this one seems to have been trouble free.
Steven J. Murdoch presents some interesting results indicating that the EURion constellation may have been obsoleted: Recent printers, scanners and image manipulation software identify images of currency, will not process the image and display an error message linking to www.rulesforuse.org. The detection algorithm is not disclosed, however it is possible to test sample images as to whether they are identified as currency. This webpage shows an initial analysis of the algorithm’s properties, based on results from the automated generation and testing of images. [...] Initially it was thought that the “Eurion constellation” was used to identify banknotes in the newly deployed software based system, since this has been confirmed to be the technique used by colour photocopiers, and was both necessary and sufficient to prevent an item being duplicated using the photocopier tested. However further investigation showed that the detection performed by software is different from the system used in colour photocopiers, and the Eurion constellation is neither necessary nor sufficient, and in fact it probably is not even a factor.
good Redshift tips
A rather sordid tale of IP acquisition and exploitation, from the sounds of it
Out of the box, Fedora 19 doesn’t have support for the broadcom wifi chip in the MacBook Pro 15″ Retina. There are quite a few complex instructions for adjusting firmware and compiling bits and bobs etc, but the easiest way to get it up and running on Fedora is using rpmfusion.
You can do it by downloading a bunch of RPMs and stuffing around with USB drives, but it’s way easier if you set up network access first, via either a Thunderbolt Ethernet adapter (make sure it’s plugged in before starting up, as hotplugging Thunderbolt doesn’t work under Linux) or Bluetooth. The Bluetooth connection can be either to a mobile phone sharing its data connection or, if you have another Mac around, to that Mac sharing its wifi network over Bluetooth (turn on Internet Sharing in the Sharing settings panel).
Once you have network access, run a yum update so you have the latest packages from Fedora; it didn’t work for me with the plain Fedora 19 install.
Then go to rpmfusion.org and install first the “RPM Fusion free for Fedora 19” RPM, then the “RPM Fusion nonfree for Fedora 19” one.
Finally, run ‘sudo yum install broadcom-wl’. After a reboot, Linux should come back up with wifi working.
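Condensed, the whole sequence looks roughly like this (the release-package URLs follow RPM Fusion’s usual naming; check rpmfusion.org for the current ones):

```shell
# Get current packages first; the stock Fedora 19 install didn't work for me.
sudo yum update -y

# Enable the RPM Fusion free and nonfree repositories for Fedora 19.
sudo yum localinstall -y \
  http://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-19.noarch.rpm \
  http://download1.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-19.noarch.rpm

# Install the Broadcom wireless driver, then reboot to load it.
sudo yum install -y broadcom-wl
sudo reboot
```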
HackTX is the biggest hackathon in Texas. It’s a 24 hour annual hackathon hosted by the Hacker Lounge and Technology Entrepreneurship Society student organizations at The University of Texas at Austin. It’s made up of 500 hackers with $10,000 in prizes. By far, it’s the biggest hackathon that I’ve personally attended, and I was pretty damn excited to represent Rackspace as a sponsor.
Kick Off
To kick things off, some of the sponsors gave a demo of their APIs (plus associated software development kits) and the kinds of things you can do with their services. I had a chance to demonstrate what the Rackspace Cloud could do for these hackers. I showed them how to use the Developer Discount to sign up for a Rackspace account that they could use during the hackathon and beyond without having to pay for anything. They could use that account to spin up a Performance 1 Cloud Server quickly that would be able to host whatever project they were hacking on. Finally I showed them how to use the Rackspace SDKs to make it easy for them to access other cloud APIs in the programming language of their choice. Knowing that Java was part of the curriculum at UT Austin, I focused on how to use the Rackspace Java SDK, powered by Apache jclouds.
Here’s a short video of the demo I did.
The Hackers
It was great to be surrounded by so many developers (mostly students) all geared up and ready to go.
There was no shortage of passion for coding and getting something cool built. When it came time to go heads down and really start hacking away, I happened to be sitting by a group that was constantly whistling. Sitting there in front of their laptops, coding away, and whistling at their screens. I didn’t think too much of it at the time, except for occasionally cringing at an off-key whistle.
Coda
It turns out I was sitting by the eventual winners of HackTX! They were building a web application that was “Guitar Hero for whistling”, aptly named Whistle Hero. Give it a try, it’s a riot. I can’t pull off the Kill Bill whistle song, but I can do a pretty good Twinkle Twinkle Little Star. Second place went to Relevant xkcd, which is always useful, and third went to an app called Alert Meet.
Congrats to all of the hackers who participated. It was a great hackathon and we hope to see you next year!
I’ve been in Amsterdam two days so far in preparation for the CloudStack Collaboration Conference. I had planned to come in, show up at the Schuberg Philis offices, and help with getting things ready for the conference. There hasn’t been a lot for me to do, though: I looked at some shiny demo racks, I helped load a monitor, and that’s about it. The folks at SBP are really very squared away in terms of conference preparation and are doing very well.
This has given me time to focus on recovering from jet lag and to spend time talking with folks. I’ve also already started meeting folks I previously knew only via email. Some of the pre-conference discussions are intriguing. But this is all pre-conference; tomorrow things actually start, and they start with a hackathon. The proposed hacking sessions leave me with multiple things that I want to work on. Top of mind are:
- Docs – fixing 4.2.1 release notes and working on 4.3 and beyond.
- Gluster – getting CloudStack to consume GlusterFS natively.
- KVM Agent refactoring
We also have the space till late in the night, so this won’t be a hack-for-a-few-hours-then-disappear affair; we can keep working well into the evening.
One of the things that I didn’t like about our last in-person hackfest was the lack of a feedback cycle. So I want to try to encourage the folks holding the hackfest sessions to report what they worked on and what actually got accomplished, at least to the mailing list, but hopefully also on a blog.
Should be a fun day.
Game 8 of the 2013 World Chess Championship ended with a draw. Carlsen had the white pieces, and used only 20 minutes of time for the entire game.
The match is two thirds complete, and the score is: Carlsen 5.0 - Anand 3.0
Time is running out for Anand; can he mount a challenge in the remaining games?
A few short observations from a whirlwind trip down south:
- The Santa Ynez Inn is beautiful. What a nice place to unwind.
- California is dry, dry, dry. Parched land was everywhere, dust blowing, cows being fed on hay as there is no open range grass left anywhere. Please, somebody, bring us rain, and lots of it!
- Central coast wineries have really hit the big time. There was lots and lots of delicious wine, from lots of wineries I'd never heard of, using all sorts of varieties of grapes I was totally unfamiliar with.
- Central coast wineries know they are making lots of delicious wine. At most of the wineries we visited, the low-end bottles were going for $25 to $30, and we visited multiple wineries where the regular tasting menu included wines costing $70 or more.
- Fess Parker winery hasn't made sparkling wine in over a decade. Shows how long it's been since we were down that way.
- On the other hand, the Flying Goat sparkling wines were quite nice; we particularly enjoyed their Cremant
- The biggest foodie hit of the weekend was the San Marcos Farms honey, possibly available direct from San Marcos Farms but we found it at the Olive Barn in Los Olivos. In particular, the "Avocado Honey" is quite surprising. It's not avocado flavored (yuk), but rather is produced by bees who make their homes in an avocado orchard. Super!
- Lompoc, Solvang, Santa Ynez, Buellton, Los Olivos: years may have passed, but these small towns barely seem to have changed at all.
- My niece and nephew, however, are shooting up like sprouts! How quickly children grow up...
Barbara King reports for NPR that being bilingual not only opens up new worlds, but also may have mental health benefits:
The largest study so far to ask whether speaking two languages might delay the onset of dementia symptoms in bilingual patients as compared to monolingual patients has reported a robust result. Bilingual patients suffer dementia onset an average of 4.5 years later than those who speak only a single language.
My school-days French gets rustier with every passing year, and the best I can do in German is say please and thank you. At least in Greece, I can now order coffee and such. But I’ve been wanting to get further along, despite being in my forties and supposedly past the prime for learning new languages. If there ever was an argument for trying to get that sorted out sooner rather than later, this is a great one.
I shoot RAW almost exclusively and it’s tempting to simply say that everyone should do the same. But that’s not the case at all. It’d be more accurate to say that you should shoot RAW when you know that you’ll need or want the headroom that it gives you and are willing to spend the time and effort to reap its benefits. Pye Jirsa’s guide in SLR Lounge gives a bit more detail.